Jan 31 04:26:30 crc systemd[1]: Starting Kubernetes Kubelet...
Jan 31 04:26:30 crc restorecon[4682]: Relabeled /var/lib/kubelet/config.json from system_u:object_r:unlabeled_t:s0 to system_u:object_r:container_var_lib_t:s0
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/device-plugins not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/device-plugins/kubelet.sock not reset as customized by admin to system_u:object_r:container_file_t:s0
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/volumes/kubernetes.io~configmap/nginx-conf/..2025_02_23_05_40_35.4114275528/nginx.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/22e96971 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/21c98286 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8/containers/networking-console-plugin/0f1869e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c15,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/46889d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/5b6a5969 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/setup/6c7921f5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4804f443 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/2a46b283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/a6b5573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/4f88ee5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c225,c458
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/5a4eee4b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c963
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/d1b160f5dda77d281dd8e69ec8d817f9/containers/kube-rbac-proxy-crio/cd87c521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c215,c682
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_33_42.2574241751/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/38602af4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/1483b002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/0346718b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/d3ed4ada not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/3bb473a5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/8cd075a9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/00ab4760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/containers/router/54a21c09 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/70478888 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/43802770 not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/955a0edc not reset as customized by admin to system_u:object_r:container_file_t:s0:c176,c499
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/bca2d009 not reset as customized by admin to system_u:object_r:container_file_t:s0:c140,c1009
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/37a5e44f-9a88-4405-be8a-b645485e7312/containers/network-operator/b295f9bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c589,c726
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..2025_02_23_05_21_22.3617465230/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-binary-copy/cnibincopy.sh not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..2025_02_23_05_21_22.2050650026/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes/kubernetes.io~configmap/cni-sysctl-allowlist/allowlist.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/bc46ea27 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5731fc1b not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/egress-router-binary-copy/5e1b2a3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/943f0936 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/3f764ee4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/cni-plugins/8695e3f9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/aed7aa86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/c64d7448 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/bond-cni-plugin/0ba16bd2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/207a939f not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/54aa8cdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/routeoverride-cni/1f5fa595 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/bf9c8153 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/47fba4ea not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni-bincopy/7ae55ce9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7906a268 not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/ce43fa69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/whereabouts-cni/7fc7ea3a not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/d8c38b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c203,c924
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/9ef015fb not reset as customized by admin to system_u:object_r:container_file_t:s0:c138,c778
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/containers/kube-multus-additional-cni-plugins/b9db6a41 not reset as customized by admin to system_u:object_r:container_file_t:s0:c574,c582
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/b1733d79 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/afccd338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/network-metrics-daemon/9df0a185 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/18938cf8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c476,c820
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/7ab4eb23 not reset as customized by admin to system_u:object_r:container_file_t:s0:c272,c818
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/containers/kube-rbac-proxy/56930be6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c432,c991
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_35.630010865 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..2025_02_23_05_21_35.1088506337/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes/kubernetes.io~configmap/ovnkube-config/ovnkube.conf not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/0d8e3722 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/d22b2e76 not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/kube-rbac-proxy/e036759f not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/2734c483 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/57878fe7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/3f3c2e58 not reset as customized by admin to system_u:object_r:container_file_t:s0:c89,c211
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/375bec3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c382,c850
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/containers/ovnkube-cluster-manager/7bc41e08 not reset as customized by admin to system_u:object_r:container_file_t:s0:c440,c975
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/48c7a72d not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/4b66701f not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/containers/download-server/a5a1c202 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..2025_02_23_05_21_40.3350632666/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-cert-acceptance-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/ovnkube-identity-cm/additional-pod-admission-cond.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..2025_02_23_05_21_40.1388695756 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/volumes/kubernetes.io~configmap/env-overrides/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/26f3df5b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/6d8fb21d not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/webhook/50e94777 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208473b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/ec9e08ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3b787c39 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/208eaed5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/93aa3a2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/ef543e1b-8068-4ea3-b32a-61027b32e95d/containers/approver/3c697968 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/ba950ec9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/cb5cdb37 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3b6479f0-333b-4a96-9adf-2099afdc2447/containers/network-check-target-container/f2df9827 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..2025_02_23_05_22_30.473230615/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_24_06_22_02.1904938450/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/fedaa673 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/9ca2df95 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/machine-config-operator/b2d7460e not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2207853c not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/241c1c29 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/containers/kube-rbac-proxy/2d910eaf not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/..2025_02_23_05_23_49.3726007728/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/..2025_02_23_05_23_49.841175008/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/etcd-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178 not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.843437178/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/c6c0f2e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/399edc97 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8049f7cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/0cec5484 not reset as customized by admin to system_u:object_r:container_file_t:s0:c263,c871
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/312446d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c406,c828
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/containers/etcd-operator/8e56a35d not reset as customized by admin to system_u:object_r:container_file_t:s0:c84,c419
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.133159589/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511
Jan 31 04:26:30 crc restorecon[4682]:
/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/2d30ddb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/eca8053d not reset as customized by admin to system_u:object_r:container_file_t:s0:c380,c909 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/c3a25c9a not reset as customized by admin to system_u:object_r:container_file_t:s0:c168,c522 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/containers/kube-controller-manager-operator/b9609c22 not reset as customized by admin to system_u:object_r:container_file_t:s0:c108,c511 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/e8b0eca9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/b36a9c3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/dns-operator/38af7b07 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/ae821620 not reset as customized by admin to system_u:object_r:container_file_t:s0:c106,c418 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/baa23338 not reset as customized by admin to system_u:object_r:container_file_t:s0:c529,c711 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/containers/kube-rbac-proxy/2c534809 not reset as customized by admin to system_u:object_r:container_file_t:s0:c968,c969 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3532625537/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/59b29eae not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/c91a8e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c381 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/4d87494a not reset as customized by admin to system_u:object_r:container_file_t:s0:c442,c857 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/containers/kube-scheduler-operator-container/1e33ca63 not reset as customized by admin to system_u:object_r:container_file_t:s0:c661,c999 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/8dea7be2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d0b04a99 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/kube-rbac-proxy/d84f01e7 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/4109059b not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/a7258a3e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/containers/package-server-manager/05bdf2b6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/f3261b51 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/315d045e not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/5fdcf278 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/d053f757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/containers/control-plane-machine-set-operator/c2850dc7 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..2025_02_23_05_22_30.2390596521/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes/kubernetes.io~configmap/marketplace-trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fcfb0b2b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c7ac9b7d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/fa0c0d52 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/c609b6ba not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/2be6c296 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/89a32653 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/4eb9afeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/containers/marketplace-operator/13af6efa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/b03f9724 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/e3d105cc not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/containers/olm-operator/3aed4d83 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1906041176/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/0765fa6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/2cefc627 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/3dcc6345 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/containers/kube-storage-version-migrator-operator/365af391 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c18 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-SelfManagedHA-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-TechPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-DevPreviewNoUpgrade.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes/kubernetes.io~empty-dir/available-featuregates/featureGate-Hypershift-Default.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b1130c0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/236a5913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-api/b9432e26 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/5ddb0e3f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/986dc4fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/8a23ff9a not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/9728ae68 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/containers/openshift-config-operator/665f31d0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c12 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1255385357/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/..2025_02_23_05_23_57.573792656/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/service-ca-bundle/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_22_30.3254245399/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc 
restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes/kubernetes.io~configmap/trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/136c9b42 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/98a1575b not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/cac69136 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/5deb77a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/containers/authentication-operator/2ae53400 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 
04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3608339744/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes/kubernetes.io~configmap/config/operator-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/e46f2326 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/dc688d3c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/3497c3cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/containers/service-ca-operator/177eb008 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.3819292994/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/af5a2afa not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/d780cb1f not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/49b0f374 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/containers/openshift-apiserver-operator/26fbb125 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.3244779536/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/cf14125a not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/b7f86972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/e51d739c not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/88ba6a69 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/669a9acf not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/5cd51231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/75349ec7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/15c26839 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/45023dcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/ingress-operator/2bb66a50 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/64d03bdd not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/ab8e7ca0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/containers/kube-rbac-proxy/bb9be25f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c11
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_22_30.2034221258/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/9a0b61d3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/d471b9d2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/containers/cluster-image-registry-operator/8cb76b8e not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/11a00840 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/ec355a92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/containers/catalog-operator/992f735e not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..2025_02_23_05_22_30.1782968797/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d59cdbbc not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/72133ff0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/c56c834c not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/d13724c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/containers/openshift-controller-manager-operator/0a498258 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c14
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa471982 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fc900d92 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/containers/machine-config-server/fa7d68da not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/4bacf9b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/424021b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/migrator/fc2e31a3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/f51eefac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/c8997f2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/containers/graceful-termination/7481f599 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..2025_02_23_05_22_49.2255460704/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes/kubernetes.io~configmap/signing-cabundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/fdafea19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/d0e1c571 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/ee398915 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/containers/service-ca-controller/682bb6b8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a3e67855 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/a989f289 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/setup/915431bd not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/7796fdab not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/dcdb5f19 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-ensure-env-vars/a3aaa88c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/5508e3e6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/160585de not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-resources-copy/e99f8da3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/8bc85570 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/a5861c91 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcdctl/84db1135 not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/9e1a6043 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/c1aba1c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd/d55ccd6d not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/971cc9f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/8f2e3dcf not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-metrics/ceb35e9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/1c192745 not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/5209e501 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-readyz/f83de4df not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/e7b978ac not reset as customized by admin to system_u:object_r:container_file_t:s0:c294,c884
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/c64304a1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c1016
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/2139d3e2895fc6797b9c76a1b4c9886d/containers/etcd-rev/5384386b not reset as customized by admin to system_u:object_r:container_file_t:s0:c666,c920
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/cce3e3ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/multus-admission-controller/8fb75465 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/740f573e not reset as customized by admin to system_u:object_r:container_file_t:s0:c435,c756
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/containers/kube-rbac-proxy/32fd1134 not reset as customized by admin to system_u:object_r:container_file_t:s0:c268,c620
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/0a861bd3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/80363026 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/containers/serve-healthcheck-canary/bfa952a8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c19,c24
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..2025_02_23_05_33_31.2122464563/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..2025_02_23_05_33_31.333075221 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/793bf43d not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/7db1bb6e not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/kube-rbac-proxy/4f6a0368 not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/c12c7d86 not reset as customized by admin to system_u:object_r:container_file_t:s0:c381,c387
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/36c4a773 not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/4c1e98ae not reset as customized by admin to system_u:object_r:container_file_t:s0:c142,c438
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/containers/machine-approver-controller/a4c8115c not reset as customized by admin to system_u:object_r:container_file_t:s0:c129,c158
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/setup/7db1802e not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver/a008a7ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-syncer/2c836bac not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-cert-regeneration-controller/0ce62299 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-insecure-readyz/945d2457 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/containers/kube-apiserver-check-endpoints/7d5c1dd8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c97,c980
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/advanced-cluster-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-broker-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq-streams-console/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amq7-interconnect-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-automation-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ansible-cloud-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry-3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bamoe-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/index.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/businessautomation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cephcsi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator not reset as customized by admin
to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cincinnati-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-kube-descheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/compliance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/container-security-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/costmanagement-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cryostat-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datagrid/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devspaces/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devworkspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dpu-network-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eap/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/file-integrity-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-console/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fuse-online/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gatekeeper-operator-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jws-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kernel-module-management-hub/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kiali-ossm/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logic-operator-rhel8/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lvms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mcg-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mta-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtc-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtr-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mtv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-client-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-csi-addons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-multicluster-orchestrator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odf-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odr-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/bundle-v1.15.0.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/channel.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-cert-manager-operator/package.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-custom-metrics-autoscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-pipelines-operator-rh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-secondary-scheduler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-bridge-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/quay-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/recipe/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/red-hat-hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redhat-oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rh-service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhacs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhbk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhdh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 
crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhods-prometheus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhpam-kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhsso-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rook-ceph-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/run-once-duration-override-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sandboxed-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/security-profiles-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/serverless-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-registry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/servicemeshoperator3/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/submariner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tang-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustee-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volsync-product/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/catalog/web-terminal/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/bc8d0691 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/6b76097a not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-utilities/34d1af30 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/312ba61c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/645d5dd1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/extract-content/16e825f0 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/4cf51fc9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/2a23d348 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/containers/registry-server/075dbd49 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/..2025_02_24_06_09_13.3521195566/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes/kubernetes.io~configmap/serviceca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/dd585ddd not reset as customized by admin to system_u:object_r:container_file_t:s0:c377,c642 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/17ebd0ab not reset as customized by admin to system_u:object_r:container_file_t:s0:c338,c343 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/containers/node-ca/005579f4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c842,c986 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_23_05_23_11.449897510/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_23_05_23_11.1287037894 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..2025_02_23_05_23_11.1301053334/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes/kubernetes.io~configmap/audit-policies/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/bf5f3b9c not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/af276eb7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/fix-audit-permissions/ea28e322 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/692e6683 not reset as customized by admin to system_u:object_r:container_file_t:s0:c49,c263 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/871746a7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c701 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/containers/oauth-apiserver/4eb2e958 not reset as customized by admin to system_u:object_r:container_file_t:s0:c764,c897 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..2025_02_24_06_09_06.2875086261/console-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/console-config/console-config.yaml not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_09_06.286118152/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..2025_02_24_06_09_06.3865795478/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/oauth-serving-cert/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..2025_02_24_06_09_06.584414814/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/ca9b62da not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/containers/console/0edd6fce not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.2406383837/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.openshift-global-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/config/openshift-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.1071801880/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877 not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..2025_02_24_06_20_07.2494444877/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes/kubernetes.io~configmap/proxy-ca-bundles/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/containers/controller-manager/89b4555f not reset as customized by admin to system_u:object_r:container_file_t:s0:c14,c22
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..2025_02_23_05_23_22.4071100442/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes/kubernetes.io~configmap/config-volume/Corefile not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/655fcd71 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/0d43c002 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/dns/e68efd17 not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/9acf9b65 not reset as customized by admin to system_u:object_r:container_file_t:s0:c457,c841
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/5ae3ff11 not reset as customized by admin to system_u:object_r:container_file_t:s0:c55,c1022
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/containers/kube-rbac-proxy/1e59206a not reset as customized by admin to system_u:object_r:container_file_t:s0:c466,c972
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/27af16d1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c304,c1017
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/7918e729 not reset as customized by admin to system_u:object_r:container_file_t:s0:c853,c893
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/containers/dns-node-resolver/5d976d0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c585,c981
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..2025_02_23_05_38_56.1112187283/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/config/controller-config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_23_05_38_56.2839772658/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes/kubernetes.io~configmap/trusted-ca/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/d7f55cbb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/f0812073 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/1a56cbeb not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/7fdd437e not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/containers/console-operator/cdfb5652 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c25
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..2025_02_24_06_17_29.3844392896/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/etcd-serving-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..2025_02_24_06_17_29.848549803/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..2025_02_24_06_17_29.780046231/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/audit/policy.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..2025_02_24_06_17_29.2926008347/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/image-import-ca/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..2025_02_24_06_17_29.2729721485/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes/kubernetes.io~configmap/trusted-ca-bundle/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/fix-audit-permissions/fb93119e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver/f1e8fc0e not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/containers/openshift-apiserver-check-endpoints/218511f3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c336,c787
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes/kubernetes.io~empty-dir/tmpfs/k8s-webhook-server/serving-certs not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/ca8af7b3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/72cc8a75 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/containers/packageserver/6e8a3760 not reset as customized by admin to system_u:object_r:container_file_t:s0:c12,c18
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..2025_02_23_05_27_30.557428972/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes/kubernetes.io~configmap/service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4c3455c0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/2278acb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/4b453e4f not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/containers/cluster-version-operator/3ec09bda not reset as customized by admin to system_u:object_r:container_file_t:s0:c5,c6
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..2025_02_24_06_25_03.422633132/anchors/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/trusted-ca/anchors not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..2025_02_24_06_25_03.3594477318/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/image-registry.openshift-image-registry.svc.cluster.local..5000 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~configmap/registry-certificates/default-route-openshift-image-registry.apps-crc.testing not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/edk2/cacerts.bin not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/java/cacerts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/openssl/ca-bundle.trust.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/tls-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/email-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/objsign-ca-bundle.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2ae6433e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fde84897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75680d2e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/openshift-service-serving-signer_1740288168.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/facfc4fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f5a969c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CFCA_EV_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9ef4a08a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ingress-operator_1740288202.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2f332aed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/248c8271.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d10a21f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ACCVRAIZ1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a94d09e5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c9a4d3b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40193066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd8c0d63.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b936d1c6.0 not reset as customized by
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CA_Disig_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4fd49c6c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AC_RAIZ_FNMT-RCM_SERVIDORES_SEGUROS.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b81b93f0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f9a69fa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b30d5fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ANF_Secure_Server_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b433981b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93851c9e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9282e51c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7dd1bc4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Actalis_Authentication_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/930ac5d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f47b495.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e113c810.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5931b5bc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Commercial.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2b349938.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e48193cf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/302904dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a716d4ed.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Networking.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/93bc0acc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/86212b19.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certigna_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b727005e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbc54cab.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f51bb24c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c28a8a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AffirmTrust_Premium_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9c8dfbd4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ccc52f49.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cb1c3204.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ce5e74ef.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd08c599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6d41d539.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb5fa911.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e35234b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8cb5ee0f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a7c655d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f8fc53da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Amazon_Root_CA_4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/de6d66f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d41b5e2a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/41a3f684.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1df5a75f.0 not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_2011.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e36a6752.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b872f2b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9576d26b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/228f89db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_ECC_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fb717492.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d21b73c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b1b94ef.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/595e996b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Atos_TrustedRoot_Root_CA_RSA_TLS_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b46e03d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/128f4b91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_3_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81f2d2b1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Autoridad_de_Certificacion_Firmaprofesional_CIF_A62634068.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3bde41ac.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d16a5865.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_EC-384_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0179095f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ffa7f1eb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9482e63a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4dae3dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/BJCA_Global_Root_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e359ba6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7e067d03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/95aff9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7746a63.0 not reset as customized 
by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Baltimore_CyberTrust_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/653b494a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3ad48a91.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Buypass_Class_2_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/54657681.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/82223c44.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8de2f56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2d9dafe4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d96b65e2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee64a828.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/40547a79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5a3f0ff8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a780d93.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/34d996fb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/eed8c118.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/89c02a45.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b1159c4c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/COMODO_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d6325660.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d4c339cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8312c4c1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certainly_Root_E1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8508e720.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5fdd185d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48bec511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/69105f4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0b9bc432.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Certum_Trusted_Network_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/32888f65.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b03dec0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/219d9499.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_ECC_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5acf816d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbf06781.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-01.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc99f41e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/CommScope_Public_Trust_RSA_Root-02.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/AAA_Certificate_Services.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/985c1f52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8794b4e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_BR_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e7c037b4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ef954a4e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_EV_Root_CA_1_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2add47b6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/90c5a3c8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0f3e76e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/53a1b57a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/D-TRUST_Root_Class_3_CA_2_EV_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5ad8a5d6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/68dd7389.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d04f354.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d6437c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/062cdee6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bd43e1dd.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Assured_ID_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7f3d5d1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c491639e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3513523f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/399e7759.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/feffd413.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d18e9066.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/607986c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c90bc37d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1b0f7e5c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e08bfd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Global_Root_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dd8e9d41.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed39abd0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a3418fda.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bc3f2570.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_High_Assurance_EV_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/244b5494.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/81b9768f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4be590e0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_ECC_P384_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9846683b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/252252d2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e8e7201.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_TLS_RSA4096_Root_G5.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d52c538d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c44cc0c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/DigiCert_Trusted_Root_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/75d1b2ed.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a2c66da8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ecccd8db.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust.net_Certification_Authority__2048_.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/aee5f10d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3e7271e8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0e59380.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4c3982f2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b99d060.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf64f35b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0a775a30.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/002c0b4f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cc450945.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_EC1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/106f3e4d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b3fb433b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GlobalSign.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4042bcee.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/02265526.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/455f1b52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0d69c7e1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9f727ac7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Entrust_Root_Certification_Authority_-_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5e98733a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0cd152c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dc4d6a89.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6187b673.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/FIRMAPROFESIONAL_CA_ROOT-A_WEB.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ba8887ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/068570d1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f081611a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/48a195d8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GDCA_TrustAUTH_R5_ROOT.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f6fa695.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab59055e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b92fd57f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GLOBALTRUST_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fa5da96b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ec40989.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7719f463.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/GTS_Root_R1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1001acf7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f013ecaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/626dceaf.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c559d742.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1d3472b9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9479c8c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a81e292b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4bfab552.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_E46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Go_Daddy_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e071171e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/57bcb2da.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_ECC_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ab5346f4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5046c355.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HARICA_TLS_RSA_Root_CA_2021.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/865fbdf9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da0cfd1d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/85cde254.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_ECC_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cbb3f32b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureSign_RootCA11.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hellenic_Academic_and_Research_Institutions_RootCA_2015.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5860aaa6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/31188b5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/HiPKI_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c7f1359b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16
Jan 31 04:26:30 crc restorecon[4682]:
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f15c80c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Hongkong_Post_Root_CA_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/09789157.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ISRG_Root_X2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/18856ac4.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e09d511.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Commercial_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cf701eeb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d06393bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/IdenTrust_Public_Sector_Root_CA_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/10531352.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Izenpe.com.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SecureTrust_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b0ed035a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsec_e-Szigno_Root_CA_2009.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8160b96c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e8651083.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2c63f966.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_ECC_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d89cda1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/01419da9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_RSA_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7a5b843.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Microsoft_RSA_Root_Certificate_Authority_2017.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bf53fb88.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9591a472.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3afde786.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Gold_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NAVER_Global_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3fb36b73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d39b0a2c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a89d74c2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/cd58d51e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b7db1890.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/NetLock_Arany__Class_Gold__F__tan__s__tv__ny.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/988a38cb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/60afe812.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f39fc864.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5443e9e3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GB_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e73d606e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dfc0fe80.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b66938e9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1e1eab7c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/OISTE_WISeKey_Global_Root_GC_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/773e07ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c899c73.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d59297b8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ddcda989.0 not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_1_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/749e9e03.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/52b525c7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_RootCA3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d7e8dc79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a819ef2.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/08063a00.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6b483515.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_2_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/064e0aa9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1f58a078.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6f7454b3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7fa05551.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3.pem not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76faf6c0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9339512a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f387163d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee37c333.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/QuoVadis_Root_CA_3_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e18bfb83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e442e424.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fe8a2cd8.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/23f4c490.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5cd81ad7.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f0c70a8d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7892ad52.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SZAFIR_ROOT_CA2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4f316efb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_EV_Root_Certification_Authority_RSA_R2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/06dc52d5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/583d0756.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Sectigo_Public_Server_Authentication_Root_R46.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:30 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_ECC.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0bf05006.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/88950faa.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9046744a.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/3c860d51.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_Root_Certification_Authority_RSA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/6fa5da56.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/33ee480d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Secure_Global_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/63a2c897.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SSL.com_TLS_ECC_Root_CA_2022.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/bdacca6f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ff34af3f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/dbff3a01.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Security_Communication_ECC_RootCA1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_C1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Class_2_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/406c9bb1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_C3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Starfield_Services_Root_Certificate_Authority_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/SwissSign_Silver_CA_-_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/99e1b953.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/T-TeleSec_GlobalRoot_Class_3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/14bc7599.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TUBITAK_Kamu_SM_SSL_Kok_Sertifikasi_-_Surum_1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Global_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/7a3adc42.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TWCA_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f459871d.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_ECC_Root_2020.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_Root_CA_-_G1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telekom_Security_TLS_RSA_Root_2023.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TeliaSonera_Root_CA_v1.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Telia_Root_CA_v2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8f103249.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f058632f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-certificates.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9bf03295.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/98aaf404.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TrustAsia_Global_Root_CA_G4.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1cef98f5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/073bfcc5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/2923b3f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f249de83.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/edcbddb5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/emSign_ECC_Root_CA_-_G3.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P256_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9b5697b0.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/1ae85e5e.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/b74d2bd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/Trustwave_Global_ECC_P384_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/d887a5bb.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9aef356c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/TunTrust_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fd64f3fc.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e13665f9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Extended_Validation_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/0f5dc4f3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/da7377f6.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/UCA_Global_G2_Root.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/c01eb047.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/304d27c3.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ed858448.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_ECC_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/f30dd6ad.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/04f60c28.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/vTrus_ECC_Root_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/USERTrust_RSA_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/fc5a8f99.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/35105088.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ee532fd5.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/XRamp_Global_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/706f604c.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/76579174.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/8d86cdd1.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/882de061.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/certSIGN_ROOT_CA_G2.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/5f618aec.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/a9d40e02.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e-Szigno_Root_CA_2017.pem 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/e868b802.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/83e9984f.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ePKI_Root_Certification_Authority.pem not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/ca6e4ad9.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/9d6523ce.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/4b718d9b.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes/kubernetes.io~empty-dir/ca-trust-extracted/pem/directory-hash/869fbf79.0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/containers/registry/f8d22bdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c10,c16 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/6e8bbfac not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/54dd7996 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator/a4f1bb05 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/207129da not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/c1df39e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/containers/cluster-samples-operator-watch/15b8f1cd not reset as customized by admin to system_u:object_r:container_file_t:s0:c9,c17 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3523263858/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..2025_02_23_05_27_49.3256605594/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes/kubernetes.io~configmap/images/images.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/77bd6913 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/2382c1b1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/kube-rbac-proxy/704ce128 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/70d16fe0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/bfb95535 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/containers/machine-api-operator/57a8e8e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c0,c15 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711 not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..2025_02_23_05_27_49.3413793711/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/1b9d3e5e not reset as customized by admin to system_u:object_r:container_file_t:s0:c107,c917 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/fddb173c not reset as customized by admin to system_u:object_r:container_file_t:s0:c202,c983 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/containers/kube-apiserver-operator/95d3c6c4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c219,c404 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/bfb5fff5 not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/2aef40aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/9d751cbb-f2e2-430d-9754-c882a5e924a5/containers/check-endpoints/c0391cad not reset as customized by admin to system_u:object_r:container_file_t:s0:c20,c21 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/1119e69d not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/660608b4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager/8220bd53 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/85f99d5c not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/cluster-policy-controller/4b0225f6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/9c2a3394 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-cert-syncer/e820b243 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/1ca52ea0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c776,c1007 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/f614b9022728cf315e60c057852e563e/containers/kube-controller-manager-recovery-controller/e6988e45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c214,c928 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes/kubernetes.io~configmap/mcc-auth-proxy-config/..2025_02_24_06_09_21.2517297950/config-file.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/6655f00b not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/98bc3986 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/machine-config-controller/08e3458a not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/2a191cb0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/6c4eeefb not reset as customized by admin to system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/containers/kube-rbac-proxy/f61a549c not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c4,c17 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/24891863 not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/hostpath-provisioner/fbdfd89c not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/9b63b3bc not reset as customized by admin to system_u:object_r:container_file_t:s0:c37,c572 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/liveness-probe/8acde6d6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/node-driver-registrar/59ecbba3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/containers/csi-provisioner/685d4be3 not reset as customized by admin to system_u:object_r:container_file_t:s0:c318,c553 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..2025_02_24_06_20_07.341639300/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/config.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.client-ca.configmap not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/config/openshift-route-controller-manager.serving-cert.secret not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851 not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..2025_02_24_06_20_07.2950937851/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes/kubernetes.io~configmap/client-ca/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/containers/route-controller-manager/feaea55e not reset as customized by admin to system_u:object_r:container_file_t:s0:c2,c23 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abinitio-runtime-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/accuknox-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aci-containers-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airlock-microgateway/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ako-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloy/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 
31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anchore-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-cloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/appdynamics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-dcap-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ccm-node-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cfm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cilium-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloud-native-postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudera-streams-messaging-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudnative-pg/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cnfv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/conjur-follower-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/coroot-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cte-k8s-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-deploy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/digitalai-release-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edb-hcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/elasticsearch-eck-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/federatorai-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fujitsu-enterprise-postgres-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/function-mesh/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/harness-gitops-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hcp-terraform-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hpe-ezmeral-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-application-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-directory-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-dr-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-licensing-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infoscale-sds-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infrastructure-asset-orchestrator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-device-plugins-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/intel-kubernetes-power-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-openshift-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8s-triliovault/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-ati-updates/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-framework/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-ingress/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-licensing/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-kcos-sso/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-load-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-loadcore-agents/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nats-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-nimbusmosaic-dusim/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-rest-api-browser-v1/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-appsec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-db/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-diagnostics/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-logging/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-migration/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-msg-broker/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-notifications/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-stats-dashboards/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-storage/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-test-core/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-wap-ui/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keysight-websocket-service/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kong-gateway-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubearmor-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lenovo-locd-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memcached-operator-ogaye/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/memory-machine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-enterprise/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netapp-spark-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-adm-agent-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netscaler-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-repository-ha-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nginx-ingress-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nim-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxiq-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nxrm-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odigos-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/open-liberty-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftartifactoryha-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshiftxray-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/operator-certification-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pmem-csi-operator-os/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-component-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/runtime-fabric-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sanstoragecsi-operator-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/smilecdr-operator/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sriov-fec/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-commons-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stackable-zookeeper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-tsc-client-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tawon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tigera-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vcp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/webotx-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 
crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc 
restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/63709497 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/d966b7fd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-utilities/f5773757 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/81c9edb9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/57bf57ee not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/extract-content/86f5e6aa not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/0aabe31d not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/d2af85c2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/containers/registry-server/09d157d9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/3scale-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-acmpca-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigateway-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-apigatewayv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-applicationautoscaling-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-athena-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudfront-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudtrail-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatch-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-cloudwatchlogs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-documentdb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-dynamodb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ec2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecr-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ecs-controller/catalog.json not reset as customized by admin 
to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-efs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eks-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elasticache-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-elbv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-emrcontainers-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-eventbridge-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-iam-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kafka-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-keyspaces-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kinesis-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-kms-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-lambda-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-memorydb-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-mq-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-networkfirewall-controller/catalog.json not 
reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-opensearchservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-organizations-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-pipes-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-prometheusservice-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-rds-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-recyclebin-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-route53resolver-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-s3-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sagemaker-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-secretsmanager-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ses-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sfn-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sns-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-sqs-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-ssm-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ack-wafv2-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/airflow-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alloydb-omni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/alvearie-imaging-ingestion/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/amd-gpu-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/analytics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/annotationlab/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicast-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-api-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurio-registry/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apicurito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/apimatic-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/application-services-metering-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aqua/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/argocd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/assisted-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/authorino-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/automotive-infra/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aws-efs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/awss3-operator-registry/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/azure-service-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/beegfs-csi-driver-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/bpfman-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-k/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/camel-karavan-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cass-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cert-utils-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-aas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-impairment-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cluster-manager/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/codeflare-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-kubevirt-hyperconverged/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-trivy-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/community-windows-machine-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/customized-user-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cxl-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dapr-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datatrucker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dbaas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/debezium-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dell-csm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/deployment-validation-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/devopsinabox/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-amlen-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eclipse-che/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ecr-secret-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/edp-keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eginnovations-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/egressip-ipam-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ember-csi-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/etcd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/eventing-kogito/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/external-secrets-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/falcon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fence-agents-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flink-kubernetes-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k8gb/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/fossul-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/github-arc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitops-primer/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/gitwebhook-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/global-load-balancer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/grafana-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/group-sync-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hawtio-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hazelcast-platform-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hedvig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hive-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/horreum-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/hyperfoil-bundle/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-block-csi-operator-community/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-security-verify-access-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibm-spectrum-scale-csi-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ibmcloud-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/infinispan/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/integrity-shield-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ipfs-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/istio-workspace-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/jaeger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kaoto-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keda/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keepalived-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/keycloak-permissions-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/klusterlet/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kogito-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/koku-metrics-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/konveyor-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/korrel8r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kuadrant-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kube-green/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubecost/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubernetes-imagepuller-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/l5-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/layer7-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lbconfig-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/lib-bucket-provisioner/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/limitador-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/logging-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13
Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-helm-operator/catalog.json
not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/loki-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/machine-deletion-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mariadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marin3r/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mercury-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/microcks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-atlas-kubernetes/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/mongodb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/move2kube-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multi-nic-cni-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-global-hub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/multicluster-operators-subscription/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/must-gather-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/namespace-configuration-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ncn-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ndmspc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/netobserv-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-community-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nexus-operator-m88i/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nfs-provisioner-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nlp-server/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-discovery-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-healthcheck-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/node-maintenance-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/nsm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oadp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/observability-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/oci-ccm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ocm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/odoo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opendatahub-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openebs/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-nfd-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-node-upgrade-mutex-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/openshift-qiskit-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/opentelemetry-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patch-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/patterns-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pcc-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pelorus-operator/catalog.json not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/percona-xtradb-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/portworx-essentials/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/postgresql/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/proactive-node-scaling-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/project-quay/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometheus-exporter-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/prometurbo/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pubsubplus-eventbroker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pulp-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-cluster-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rabbitmq-messaging-topology-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/reportportal-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/resource-locker-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/rhoas-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ripsaw/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sailoperator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-commerce-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-data-intelligence-observer-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sap-hana-express-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/self-node-remediation/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/service-binding-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/shipwright-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sigstore-helm-operator/catalog.json not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/silicom-sts-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/skupper-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snapscheduler/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/snyk-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/socmmd/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonar-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosivio/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sonataflow-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/sosreport-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/spark-helm-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/special-resource-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron not reset as 
customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/stolostron-engine/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/strimzi-kafka-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/syndesis/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tagger/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tempo-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tf-controller/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc 
restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/tidb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trident-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/trustify-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ucs-ci-solutions-operator/catalog.json not reset as customized by 
admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/universal-crossplane/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/varnish-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vault-config-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/verticadb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/volume-expander-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/wandb-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/windup-operator/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yaks/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c0fe7256 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/c30319e4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-utilities/e6b1dd45 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/2bb643f0 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/920de426 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/extract-content/70fa1e87 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/a1c12a2f not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/9442e6c7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/containers/registry-server/5b45ec72 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/abot-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aerospike-kubernetes-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/aikit-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzo-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzograph-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/anzounstructured-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cloudbees-ci-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/cockroachdb-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/crunchy-postgres-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/datadog-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/dynatrace-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/entando-k8s-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/flux/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/instana-agent-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/iomesh-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/joget-dx8-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/k10-kasten-operator-term-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubemq-operator-marketplace-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/kubeturbo-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/linstor-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/marketplace-games-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/model-builder-for-vision-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/neuvector-certified-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/ovms-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/pachyderm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/redis-enterprise-operator-cert-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/seldon-deploy-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-paygo-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/starburst-enterprise-helm-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/t8c-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/timemachine-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/vfunction-server-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/xcrypt-operator-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/yugabyte-platform-operator-bundle-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/catalog/zabbix-operator-certified-rhmp/catalog.json not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/00000-1.psg.pmt not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/db.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/index.pmt not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/main.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/db/overflow.pix not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/catalog-content/cache/pogreb.v1/digest not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes/kubernetes.io~empty-dir/utilities/copy-content not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/3c9f3a59 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/1091c11b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-utilities/9a6821c6 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/ec0c35e2 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/517f37e7 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/extract-content/6214fe78 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/ba189c8b not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/351e4f31 not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/containers/registry-server/c0f219ff not reset as customized by admin to system_u:object_r:container_file_t:s0:c7,c13 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/etc-hosts 
not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/8069f607 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/559c3d82 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/wait-for-host-port/605ad488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/148df488 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/3bf6dcb4 not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler/022a2feb not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/938c3924 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/729fe23e not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-cert-syncer/1fd5cbd4 not reset as customized by admin to 
system_u:object_r:container_file_t:s0:c247,c522 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/a96697e1 not reset as customized by admin to system_u:object_r:container_file_t:s0:c378,c723 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/e155ddca not reset as customized by admin to system_u:object_r:container_file_t:s0:c133,c223 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/3dcd261975c3d6b9a6ad6367fd4facd3/containers/kube-scheduler-recovery-controller/10dd0e0f not reset as customized by admin to system_u:object_r:container_file_t:s0:c247,c522 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..2025_02_24_06_09_35.3018472960/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-trusted-ca-bundle/ca-bundle.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..2025_02_24_06_09_35.4262376737/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/audit-policies/audit.yaml not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc 
restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..2025_02_24_06_09_35.2630275752/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-cliconfig/v4-0-config-system-cliconfig not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..2025_02_24_06_09_35.2376963788/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/..data not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: 
/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes/kubernetes.io~configmap/v4-0-config-system-service-ca/service-ca.crt not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/etc-hosts not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/6f2c8392 not reset as customized by admin to system_u:object_r:container_file_t:s0:c267,c588 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/containers/oauth-openshift/bd241ad9 not reset as customized by admin to system_u:object_r:container_file_t:s0:c682,c947 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/csi-hostpath/csi.sock not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983 not reset as customized by admin to 
system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/vol_data.json not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: /var/lib/kubelet/plugins_registry not reset as customized by admin to system_u:object_r:container_file_t:s0 Jan 31 04:26:31 crc restorecon[4682]: Relabeled /var/usrlocal/bin/kubenswrapper from system_u:object_r:bin_t:s0 to system_u:object_r:kubelet_exec_t:s0 Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version. Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --register-with-taints has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. 
Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --pod-infra-container-image has been deprecated, will be removed in a future release. Image garbage collector will get sandbox image information from CRI. Jan 31 04:26:32 crc kubenswrapper[4812]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information. Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.076394 4812 server.go:211] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086041 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086106 4812 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086113 4812 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086120 4812 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086125 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086131 4812 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086136 4812 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086142 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086148 4812 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 
04:26:32.086153 4812 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086158 4812 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086163 4812 feature_gate.go:330] unrecognized feature gate: Example Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086168 4812 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086174 4812 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086179 4812 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086185 4812 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086190 4812 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086197 4812 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086205 4812 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086211 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086216 4812 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086222 4812 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086227 4812 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086233 4812 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086237 4812 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086242 4812 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086249 4812 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086255 4812 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086261 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086266 4812 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086279 4812 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086284 4812 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086289 4812 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086295 4812 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086302 4812 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086309 4812 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086314 4812 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086319 4812 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086326 4812 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086332 4812 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086338 4812 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086344 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086350 4812 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086356 4812 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086361 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086366 4812 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086372 4812 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086377 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086381 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086386 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086391 4812 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086396 4812 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086401 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 
04:26:32.086406 4812 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086411 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086419 4812 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086424 4812 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086429 4812 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086434 4812 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086439 4812 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086444 4812 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086450 4812 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086455 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086460 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086465 4812 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086469 4812 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086474 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086479 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 04:26:32 crc 
kubenswrapper[4812]: W0131 04:26:32.086486 4812 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086491 4812 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.086495 4812 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086606 4812 flags.go:64] FLAG: --address="0.0.0.0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086623 4812 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086636 4812 flags.go:64] FLAG: --anonymous-auth="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086644 4812 flags.go:64] FLAG: --application-metrics-count-limit="100" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086654 4812 flags.go:64] FLAG: --authentication-token-webhook="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086660 4812 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086668 4812 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086675 4812 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086683 4812 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086690 4812 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086698 4812 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086708 4812 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086716 4812 
flags.go:64] FLAG: --cgroup-driver="cgroupfs" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086723 4812 flags.go:64] FLAG: --cgroup-root="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086730 4812 flags.go:64] FLAG: --cgroups-per-qos="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086737 4812 flags.go:64] FLAG: --client-ca-file="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086744 4812 flags.go:64] FLAG: --cloud-config="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086751 4812 flags.go:64] FLAG: --cloud-provider="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086758 4812 flags.go:64] FLAG: --cluster-dns="[]" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086767 4812 flags.go:64] FLAG: --cluster-domain="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086774 4812 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086781 4812 flags.go:64] FLAG: --config-dir="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086786 4812 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086793 4812 flags.go:64] FLAG: --container-log-max-files="5" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086801 4812 flags.go:64] FLAG: --container-log-max-size="10Mi" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086807 4812 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086813 4812 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086819 4812 flags.go:64] FLAG: --containerd-namespace="k8s.io" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086825 4812 flags.go:64] FLAG: --contention-profiling="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086831 4812 flags.go:64] 
FLAG: --cpu-cfs-quota="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086867 4812 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086876 4812 flags.go:64] FLAG: --cpu-manager-policy="none" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086882 4812 flags.go:64] FLAG: --cpu-manager-policy-options="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086892 4812 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086900 4812 flags.go:64] FLAG: --enable-controller-attach-detach="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086909 4812 flags.go:64] FLAG: --enable-debugging-handlers="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086915 4812 flags.go:64] FLAG: --enable-load-reader="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086922 4812 flags.go:64] FLAG: --enable-server="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086928 4812 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086939 4812 flags.go:64] FLAG: --event-burst="100" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086947 4812 flags.go:64] FLAG: --event-qps="50" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086954 4812 flags.go:64] FLAG: --event-storage-age-limit="default=0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086961 4812 flags.go:64] FLAG: --event-storage-event-limit="default=0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086968 4812 flags.go:64] FLAG: --eviction-hard="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086977 4812 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086984 4812 flags.go:64] FLAG: --eviction-minimum-reclaim="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086991 4812 flags.go:64] FLAG: 
--eviction-pressure-transition-period="5m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.086999 4812 flags.go:64] FLAG: --eviction-soft="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087006 4812 flags.go:64] FLAG: --eviction-soft-grace-period="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087013 4812 flags.go:64] FLAG: --exit-on-lock-contention="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087020 4812 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087027 4812 flags.go:64] FLAG: --experimental-mounter-path="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087034 4812 flags.go:64] FLAG: --fail-cgroupv1="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087040 4812 flags.go:64] FLAG: --fail-swap-on="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087047 4812 flags.go:64] FLAG: --feature-gates="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087055 4812 flags.go:64] FLAG: --file-check-frequency="20s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087062 4812 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087070 4812 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087077 4812 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087084 4812 flags.go:64] FLAG: --healthz-port="10248" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087091 4812 flags.go:64] FLAG: --help="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087098 4812 flags.go:64] FLAG: --hostname-override="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087105 4812 flags.go:64] FLAG: --housekeeping-interval="10s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087112 4812 flags.go:64] FLAG: 
--http-check-frequency="20s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087119 4812 flags.go:64] FLAG: --image-credential-provider-bin-dir="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087126 4812 flags.go:64] FLAG: --image-credential-provider-config="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087134 4812 flags.go:64] FLAG: --image-gc-high-threshold="85" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087142 4812 flags.go:64] FLAG: --image-gc-low-threshold="80" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087149 4812 flags.go:64] FLAG: --image-service-endpoint="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087156 4812 flags.go:64] FLAG: --kernel-memcg-notification="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087162 4812 flags.go:64] FLAG: --kube-api-burst="100" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087169 4812 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087176 4812 flags.go:64] FLAG: --kube-api-qps="50" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087183 4812 flags.go:64] FLAG: --kube-reserved="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087190 4812 flags.go:64] FLAG: --kube-reserved-cgroup="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087197 4812 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087204 4812 flags.go:64] FLAG: --kubelet-cgroups="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087211 4812 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087218 4812 flags.go:64] FLAG: --lock-file="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087224 4812 flags.go:64] FLAG: --log-cadvisor-usage="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087231 4812 
flags.go:64] FLAG: --log-flush-frequency="5s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087238 4812 flags.go:64] FLAG: --log-json-info-buffer-size="0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087251 4812 flags.go:64] FLAG: --log-json-split-stream="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087259 4812 flags.go:64] FLAG: --log-text-info-buffer-size="0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087266 4812 flags.go:64] FLAG: --log-text-split-stream="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087279 4812 flags.go:64] FLAG: --logging-format="text" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087289 4812 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087298 4812 flags.go:64] FLAG: --make-iptables-util-chains="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087305 4812 flags.go:64] FLAG: --manifest-url="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087312 4812 flags.go:64] FLAG: --manifest-url-header="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087324 4812 flags.go:64] FLAG: --max-housekeeping-interval="15s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087331 4812 flags.go:64] FLAG: --max-open-files="1000000" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087340 4812 flags.go:64] FLAG: --max-pods="110" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087347 4812 flags.go:64] FLAG: --maximum-dead-containers="-1" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087354 4812 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087362 4812 flags.go:64] FLAG: --memory-manager-policy="None" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087368 4812 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 
04:26:32.087377 4812 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087384 4812 flags.go:64] FLAG: --node-ip="192.168.126.11" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087391 4812 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/control-plane=,node-role.kubernetes.io/master=,node.openshift.io/os_id=rhcos" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087411 4812 flags.go:64] FLAG: --node-status-max-images="50" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087420 4812 flags.go:64] FLAG: --node-status-update-frequency="10s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087427 4812 flags.go:64] FLAG: --oom-score-adj="-999" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087434 4812 flags.go:64] FLAG: --pod-cidr="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087440 4812 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:33549946e22a9ffa738fd94b1345f90921bc8f92fa6137784cb33c77ad806f9d" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087455 4812 flags.go:64] FLAG: --pod-manifest-path="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087462 4812 flags.go:64] FLAG: --pod-max-pids="-1" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087469 4812 flags.go:64] FLAG: --pods-per-core="0" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087475 4812 flags.go:64] FLAG: --port="10250" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087482 4812 flags.go:64] FLAG: --protect-kernel-defaults="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087489 4812 flags.go:64] FLAG: --provider-id="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087496 4812 flags.go:64] FLAG: --qos-reserved="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087503 4812 flags.go:64] FLAG: --read-only-port="10255" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 
04:26:32.087510 4812 flags.go:64] FLAG: --register-node="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087517 4812 flags.go:64] FLAG: --register-schedulable="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087524 4812 flags.go:64] FLAG: --register-with-taints="node-role.kubernetes.io/master=:NoSchedule" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087537 4812 flags.go:64] FLAG: --registry-burst="10" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087545 4812 flags.go:64] FLAG: --registry-qps="5" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087552 4812 flags.go:64] FLAG: --reserved-cpus="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087560 4812 flags.go:64] FLAG: --reserved-memory="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087568 4812 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087575 4812 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087582 4812 flags.go:64] FLAG: --rotate-certificates="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087589 4812 flags.go:64] FLAG: --rotate-server-certificates="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087596 4812 flags.go:64] FLAG: --runonce="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087603 4812 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087609 4812 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087616 4812 flags.go:64] FLAG: --seccomp-default="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087623 4812 flags.go:64] FLAG: --serialize-image-pulls="true" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087630 4812 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: 
I0131 04:26:32.087637 4812 flags.go:64] FLAG: --storage-driver-db="cadvisor" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087644 4812 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087651 4812 flags.go:64] FLAG: --storage-driver-password="root" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087657 4812 flags.go:64] FLAG: --storage-driver-secure="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087664 4812 flags.go:64] FLAG: --storage-driver-table="stats" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087671 4812 flags.go:64] FLAG: --storage-driver-user="root" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087677 4812 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087683 4812 flags.go:64] FLAG: --sync-frequency="1m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087690 4812 flags.go:64] FLAG: --system-cgroups="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087696 4812 flags.go:64] FLAG: --system-reserved="cpu=200m,ephemeral-storage=350Mi,memory=350Mi" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087708 4812 flags.go:64] FLAG: --system-reserved-cgroup="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087716 4812 flags.go:64] FLAG: --tls-cert-file="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087722 4812 flags.go:64] FLAG: --tls-cipher-suites="[]" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087730 4812 flags.go:64] FLAG: --tls-min-version="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087737 4812 flags.go:64] FLAG: --tls-private-key-file="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087743 4812 flags.go:64] FLAG: --topology-manager-policy="none" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087750 4812 flags.go:64] FLAG: --topology-manager-policy-options="" Jan 31 04:26:32 crc 
kubenswrapper[4812]: I0131 04:26:32.087757 4812 flags.go:64] FLAG: --topology-manager-scope="container" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087764 4812 flags.go:64] FLAG: --v="2" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087774 4812 flags.go:64] FLAG: --version="false" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087790 4812 flags.go:64] FLAG: --vmodule="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087798 4812 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.087805 4812 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088001 4812 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088011 4812 feature_gate.go:330] unrecognized feature gate: Example Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088019 4812 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088025 4812 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088030 4812 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088036 4812 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088041 4812 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088050 4812 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088056 4812 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088063 4812 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088069 4812 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088075 4812 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088081 4812 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088087 4812 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088093 4812 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088099 4812 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088105 4812 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088112 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088119 4812 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088124 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088130 4812 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088135 4812 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088141 4812 
feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088147 4812 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088153 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088158 4812 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088164 4812 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088169 4812 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088176 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088188 4812 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088194 4812 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088199 4812 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088204 4812 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088210 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088216 4812 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088221 4812 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088227 4812 feature_gate.go:330] 
unrecognized feature gate: NewOLM Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088233 4812 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088240 4812 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088246 4812 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088252 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088257 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088263 4812 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088268 4812 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088274 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088280 4812 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088286 4812 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088292 4812 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088297 4812 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088305 4812 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088312 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088318 4812 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088326 4812 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088334 4812 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088341 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088348 4812 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088355 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088362 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088367 4812 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088373 4812 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088379 4812 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088388 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088395 4812 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088401 4812 feature_gate.go:330] unrecognized feature gate: 
PrivateHostedZoneAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088407 4812 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088413 4812 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088420 4812 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088426 4812 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088432 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088439 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.088446 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.088465 4812 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.100655 4812 server.go:491] "Kubelet version" kubeletVersion="v1.31.5" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.100687 4812 server.go:493] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100811 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100825 
4812 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100861 4812 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100870 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100879 4812 feature_gate.go:330] unrecognized feature gate: Example Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100887 4812 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100895 4812 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100903 4812 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100911 4812 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100919 4812 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100927 4812 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100934 4812 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100943 4812 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100950 4812 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100958 4812 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100966 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 04:26:32 
crc kubenswrapper[4812]: W0131 04:26:32.100973 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100981 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100989 4812 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.100997 4812 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101005 4812 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101012 4812 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101023 4812 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101034 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101045 4812 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101054 4812 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101063 4812 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101072 4812 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101081 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101091 4812 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101099 4812 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101108 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101118 4812 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101130 4812 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101140 4812 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101150 4812 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101158 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101166 4812 feature_gate.go:330] unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101174 4812 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101183 4812 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101191 4812 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101201 4812 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101210 4812 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101218 4812 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101227 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101235 4812 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101242 4812 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101250 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101258 4812 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101266 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101273 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101281 4812 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101288 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101297 4812 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101305 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101313 4812 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101321 4812 feature_gate.go:330] 
unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101329 4812 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101337 4812 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101345 4812 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101353 4812 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101361 4812 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101369 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101379 4812 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101388 4812 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101399 4812 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101408 4812 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101417 4812 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101425 4812 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101455 4812 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101464 4812 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.101477 4812 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101705 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstallIBMCloud Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101718 4812 feature_gate.go:330] unrecognized feature gate: PersistentIPsForVirtualization Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101727 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerDynamicConfigurationManager Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101735 
4812 feature_gate.go:330] unrecognized feature gate: ExternalOIDC Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101743 4812 feature_gate.go:330] unrecognized feature gate: OnClusterBuild Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101750 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101759 4812 feature_gate.go:330] unrecognized feature gate: ConsolePluginContentSecurityPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101767 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101775 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101782 4812 feature_gate.go:330] unrecognized feature gate: MachineAPIProviderOpenStack Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101790 4812 feature_gate.go:330] unrecognized feature gate: SetEIPForNLBIngressController Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101798 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfigAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101805 4812 feature_gate.go:330] unrecognized feature gate: MachineConfigNodes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101815 4812 feature_gate.go:330] unrecognized feature gate: VSphereDriverConfiguration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101824 4812 feature_gate.go:330] unrecognized feature gate: CSIDriverSharedResource Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101832 4812 feature_gate.go:330] unrecognized feature gate: NetworkDiagnosticsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101864 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImagesAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101872 4812 feature_gate.go:330] 
unrecognized feature gate: AdditionalRoutingCapabilities Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101880 4812 feature_gate.go:330] unrecognized feature gate: VSphereControlPlaneMachineSet Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101887 4812 feature_gate.go:330] unrecognized feature gate: NodeDisruptionPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101895 4812 feature_gate.go:330] unrecognized feature gate: HardwareSpeed Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101903 4812 feature_gate.go:330] unrecognized feature gate: SignatureStores Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101911 4812 feature_gate.go:330] unrecognized feature gate: BareMetalLoadBalancer Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101919 4812 feature_gate.go:330] unrecognized feature gate: AdminNetworkPolicy Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101926 4812 feature_gate.go:330] unrecognized feature gate: NutanixMultiSubnets Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101934 4812 feature_gate.go:330] unrecognized feature gate: UpgradeStatus Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101941 4812 feature_gate.go:330] unrecognized feature gate: InsightsOnDemandDataGather Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101949 4812 feature_gate.go:330] unrecognized feature gate: DNSNameResolver Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101956 4812 feature_gate.go:330] unrecognized feature gate: NewOLM Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101965 4812 feature_gate.go:330] unrecognized feature gate: VolumeGroupSnapshot Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101973 4812 feature_gate.go:330] unrecognized feature gate: PinnedImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101981 4812 feature_gate.go:330] unrecognized feature gate: RouteAdvertisements Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 
04:26:32.101989 4812 feature_gate.go:330] unrecognized feature gate: NetworkSegmentation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.101997 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallAzure Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102005 4812 feature_gate.go:330] unrecognized feature gate: IngressControllerLBSubnetsAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102013 4812 feature_gate.go:330] unrecognized feature gate: EtcdBackendQuota Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102021 4812 feature_gate.go:330] unrecognized feature gate: ImageStreamImportMode Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102047 4812 feature_gate.go:330] unrecognized feature gate: ClusterAPIInstall Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102055 4812 feature_gate.go:330] unrecognized feature gate: OVNObservability Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102063 4812 feature_gate.go:330] unrecognized feature gate: GCPLabelsTags Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102070 4812 feature_gate.go:330] unrecognized feature gate: AutomatedEtcdBackup Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102078 4812 feature_gate.go:330] unrecognized feature gate: GatewayAPI Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102085 4812 feature_gate.go:330] unrecognized feature gate: ManagedBootImages Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102093 4812 feature_gate.go:330] unrecognized feature gate: BuildCSIVolumes Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102102 4812 feature_gate.go:330] unrecognized feature gate: PrivateHostedZoneAWS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102112 4812 feature_gate.go:353] Setting GA feature gate CloudDualStackNodeIPs=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102122 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiNetworks Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102131 4812 feature_gate.go:330] unrecognized feature gate: NetworkLiveMigration Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102140 4812 feature_gate.go:330] unrecognized feature gate: ChunkSizeMiB Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102149 4812 feature_gate.go:330] unrecognized feature gate: ClusterMonitoringConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102157 4812 feature_gate.go:330] unrecognized feature gate: VSphereMultiVCenters Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102165 4812 feature_gate.go:330] unrecognized feature gate: MinimumKubeletVersion Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102173 4812 feature_gate.go:330] unrecognized feature gate: AWSClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102181 4812 feature_gate.go:330] unrecognized feature gate: SigstoreImageVerification Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102188 4812 feature_gate.go:330] unrecognized feature gate: MultiArchInstallGCP Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102199 4812 feature_gate.go:353] Setting GA feature gate DisableKubeletCloudCredentialProviders=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102209 4812 feature_gate.go:330] unrecognized feature gate: AzureWorkloadIdentity Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102219 4812 feature_gate.go:330] unrecognized feature gate: Example Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102227 4812 feature_gate.go:330] unrecognized feature gate: AlibabaPlatform Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102234 4812 feature_gate.go:330] unrecognized feature gate: InsightsRuntimeExtractor Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102244 4812 feature_gate.go:351] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102254 4812 feature_gate.go:330] unrecognized feature gate: BootcNodeManagement Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102262 4812 feature_gate.go:330] unrecognized feature gate: MixedCPUsAllocation Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102271 4812 feature_gate.go:330] unrecognized feature gate: InsightsConfig Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102278 4812 feature_gate.go:330] unrecognized feature gate: GCPClusterHostedDNS Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102289 4812 feature_gate.go:330] unrecognized feature gate: AWSEFSDriverVolumeMetrics Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102297 4812 feature_gate.go:330] unrecognized feature gate: PlatformOperators Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102304 4812 feature_gate.go:330] unrecognized feature gate: MetricsCollectionProfiles Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102314 4812 feature_gate.go:353] Setting GA feature gate ValidatingAdmissionPolicy=true. It will be removed in a future release. 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102324 4812 feature_gate.go:330] unrecognized feature gate: VSphereStaticIPs Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.102333 4812 feature_gate.go:330] unrecognized feature gate: OpenShiftPodSecurityAdmission Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.102345 4812 feature_gate.go:386] feature gates: {map[CloudDualStackNodeIPs:true DisableKubeletCloudCredentialProviders:true DynamicResourceAllocation:false EventedPLEG:false KMSv1:true MaxUnavailableStatefulSet:false NodeSwap:false ProcMountType:false RouteExternalCertificate:false ServiceAccountTokenNodeBinding:false TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:false UserNamespacesSupport:false ValidatingAdmissionPolicy:true VolumeAttributesClass:false]} Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.103792 4812 server.go:940] "Client rotation is on, will bootstrap in background" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.109210 4812 bootstrap.go:85] "Current kubeconfig file contents are still valid, no bootstrap necessary" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.109347 4812 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-client-current.pem". 
Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.111192 4812 server.go:997] "Starting client certificate rotation" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.111239 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate rotation is enabled Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.112500 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2026-02-24 05:52:08 +0000 UTC, rotation deadline is 2025-11-16 18:21:54.096744668 +0000 UTC Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.112692 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.141664 4812 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.144570 4812 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.145056 4812 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.165812 4812 log.go:25] "Validated CRI v1 runtime API" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.208477 4812 log.go:25] "Validated CRI v1 image API" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.210556 4812 server.go:1437] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.218116 4812 
fs.go:133] Filesystem UUIDs: map[0b076daa-c26a-46d2-b3a6-72a8dbc6e257:/dev/vda4 2026-01-31-04-22-21-00:/dev/sr0 7B77-95E7:/dev/vda2 de0497b0-db1b-465a-b278-03db02455c71:/dev/vda3] Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.218189 4812 fs.go:134] Filesystem partitions: map[/dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /dev/vda3:{mountpoint:/boot major:252 minor:3 fsType:ext4 blockSize:0} /dev/vda4:{mountpoint:/var major:252 minor:4 fsType:xfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /run/user/1000:{mountpoint:/run/user/1000 major:0 minor:44 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:30 fsType:tmpfs blockSize:0} /var/lib/etcd:{mountpoint:/var/lib/etcd major:0 minor:42 fsType:tmpfs blockSize:0}] Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.246184 4812 manager.go:217] Machine: {Timestamp:2026-01-31 04:26:32.242802217 +0000 UTC m=+0.737823952 CPUVendorID:AuthenticAMD NumCores:12 NumPhysicalCores:1 NumSockets:12 CpuFrequency:2799998 MemoryCapacity:33654124544 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:21801e6708c44f15b81395eb736a7cec SystemUUID:9730f4f2-835d-4e9b-a74d-461488f96726 BootID:3069a142-20b2-4287-9a2d-d92558a419a1 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16827060224 Type:vfs Inodes:4108169 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6730825728 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/vda4 DeviceMajor:252 DeviceMinor:4 Capacity:85292941312 Type:vfs Inodes:41679680 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:30 Capacity:16827064320 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/vda3 DeviceMajor:252 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/run/user/1000 DeviceMajor:0 DeviceMinor:44 Capacity:3365408768 
Type:vfs Inodes:821633 HasInodes:true} {Device:/var/lib/etcd DeviceMajor:0 DeviceMinor:42 Capacity:1073741824 Type:vfs Inodes:4108169 HasInodes:true}] DiskMap:map[252:0:{Name:vda Major:252 Minor:0 Size:214748364800 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:fa:16:3e:b9:2c:f1 Speed:0 Mtu:1500} {Name:br-int MacAddress:d6:39:55:2e:22:71 Speed:0 Mtu:1400} {Name:ens3 MacAddress:fa:16:3e:b9:2c:f1 Speed:-1 Mtu:1500} {Name:ens7 MacAddress:fa:16:3e:b8:0b:dc Speed:-1 Mtu:1500} {Name:ens7.20 MacAddress:52:54:00:1a:58:e0 Speed:-1 Mtu:1496} {Name:ens7.21 MacAddress:52:54:00:ee:78:ea Speed:-1 Mtu:1496} {Name:ens7.22 MacAddress:52:54:00:1a:e5:a5 Speed:-1 Mtu:1496} {Name:eth10 MacAddress:4a:c9:80:f9:91:b5 Speed:0 Mtu:1500} {Name:ovn-k8s-mp0 MacAddress:0a:58:0a:d9:00:02 Speed:0 Mtu:1400} {Name:ovs-system MacAddress:c2:9c:3b:49:71:ee Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33654124544 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:0 Size:16777216 Type:Unified Level:3}] SocketID:0 BookID: DrawerID:} {Id:0 Threads:[1] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:1 Size:16777216 Type:Unified Level:3}] SocketID:1 BookID: DrawerID:} {Id:0 Threads:[10] Caches:[{Id:10 Size:32768 Type:Data Level:1} {Id:10 Size:32768 Type:Instruction Level:1} {Id:10 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:10 Size:16777216 Type:Unified Level:3}] SocketID:10 BookID: DrawerID:} {Id:0 Threads:[11] Caches:[{Id:11 Size:32768 Type:Data Level:1} {Id:11 Size:32768 Type:Instruction Level:1} {Id:11 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:11 Size:16777216 Type:Unified Level:3}] SocketID:11 BookID: DrawerID:} {Id:0 Threads:[2] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 
Type:Instruction Level:1} {Id:2 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:2 Size:16777216 Type:Unified Level:3}] SocketID:2 BookID: DrawerID:} {Id:0 Threads:[3] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:3 Size:16777216 Type:Unified Level:3}] SocketID:3 BookID: DrawerID:} {Id:0 Threads:[4] Caches:[{Id:4 Size:32768 Type:Data Level:1} {Id:4 Size:32768 Type:Instruction Level:1} {Id:4 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:4 Size:16777216 Type:Unified Level:3}] SocketID:4 BookID: DrawerID:} {Id:0 Threads:[5] Caches:[{Id:5 Size:32768 Type:Data Level:1} {Id:5 Size:32768 Type:Instruction Level:1} {Id:5 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:5 Size:16777216 Type:Unified Level:3}] SocketID:5 BookID: DrawerID:} {Id:0 Threads:[6] Caches:[{Id:6 Size:32768 Type:Data Level:1} {Id:6 Size:32768 Type:Instruction Level:1} {Id:6 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:6 Size:16777216 Type:Unified Level:3}] SocketID:6 BookID: DrawerID:} {Id:0 Threads:[7] Caches:[{Id:7 Size:32768 Type:Data Level:1} {Id:7 Size:32768 Type:Instruction Level:1} {Id:7 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:7 Size:16777216 Type:Unified Level:3}] SocketID:7 BookID: DrawerID:} {Id:0 Threads:[8] Caches:[{Id:8 Size:32768 Type:Data Level:1} {Id:8 Size:32768 Type:Instruction Level:1} {Id:8 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:8 Size:16777216 Type:Unified Level:3}] SocketID:8 BookID: DrawerID:} {Id:0 Threads:[9] Caches:[{Id:9 Size:32768 Type:Data Level:1} {Id:9 Size:32768 Type:Instruction Level:1} {Id:9 Size:524288 Type:Unified Level:2}] UncoreCaches:[{Id:9 Size:16777216 Type:Unified Level:3}] SocketID:9 BookID: DrawerID:}] Caches:[] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.246567 4812 manager_no_libpfm.go:29] cAdvisor is build without cgo 
and/or libpfm support. Perf event counters are not available. Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.246765 4812 manager.go:233] Version: {KernelVersion:5.14.0-427.50.2.el9_4.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 418.94.202502100215-0 DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.247260 4812 swap_util.go:113] "Swap is on" /proc/swaps contents="Filename\t\t\t\tType\t\tSize\t\tUsed\t\tPriority" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.247589 4812 container_manager_linux.go:267] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.247650 4812 container_manager_linux.go:272] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"crc","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"200m","ephemeral-storage":"350Mi","memory":"350Mi"},"HardEvictionThresholds":[{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":nul
l,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"ExperimentalMemoryManagerPolicy":"None","ExperimentalMemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.247971 4812 topology_manager.go:138] "Creating topology manager with none policy" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.247990 4812 container_manager_linux.go:303] "Creating device plugin manager" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.248493 4812 manager.go:142] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.248534 4812 server.go:66] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.249372 4812 state_mem.go:36] "Initialized new in-memory state store" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.249566 4812 server.go:1245] "Using root directory" path="/var/lib/kubelet" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.253220 4812 kubelet.go:418] "Attempting to sync node with API server" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.253252 4812 kubelet.go:313] "Adding static pod path" path="/etc/kubernetes/manifests" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.253324 4812 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.253344 4812 kubelet.go:324] "Adding apiserver pod source" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.253361 4812 apiserver.go:42] "Waiting for node sync before watching apiserver pods" 
Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.260050 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.260134 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.260190 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.260277 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.260814 4812 kuberuntime_manager.go:262] "Container runtime initialized" containerRuntime="cri-o" version="1.31.5-4.rhaos4.18.gitdad78d5.el9" apiVersion="v1" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.262151 4812 certificate_store.go:130] Loading cert/key pair from "/var/lib/kubelet/pki/kubelet-server-current.pem". 
Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.265051 4812 kubelet.go:854] "Not starting ClusterTrustBundle informer because we are in static kubelet mode" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266764 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266807 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266820 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266835 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266887 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266900 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/secret" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266913 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266933 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266948 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/fc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266961 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.266990 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/projected" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.267003 4812 plugins.go:603] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.267929 4812 plugins.go:603] "Loaded volume plugin" 
pluginName="kubernetes.io/csi" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.268971 4812 server.go:1280] "Started kubelet" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.269369 4812 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.269980 4812 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.270015 4812 server.go:163] "Starting to listen" address="0.0.0.0" port=10250 Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.270539 4812 server.go:236] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Jan 31 04:26:32 crc systemd[1]: Started Kubernetes Kubelet. Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.271156 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate rotation is enabled Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.271183 4812 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.271322 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 16:59:55.54473228 +0000 UTC Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.271618 4812 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.271693 4812 volume_manager.go:287] "The desired_state_of_world populator starts" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.271720 4812 volume_manager.go:289] "Starting Kubelet Volume Manager" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 
04:26:32.271991 4812 desired_state_of_world_populator.go:146] "Desired state populator starts to run" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.274822 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.274911 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.275815 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="200ms" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.277287 4812 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fb64672b7e746 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 04:26:32.268515142 +0000 UTC m=+0.763536837,LastTimestamp:2026-01-31 04:26:32.268515142 +0000 UTC m=+0.763536837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.282795 4812 factory.go:219] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.282880 4812 server.go:460] "Adding debug handlers to kubelet server" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.282886 4812 factory.go:55] Registering systemd factory Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.283067 4812 factory.go:221] Registration of the systemd container factory successfully Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.284015 4812 factory.go:153] Registering CRI-O factory Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.284040 4812 factory.go:221] Registration of the crio container factory successfully Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.284061 4812 factory.go:103] Registering Raw factory Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.284078 4812 manager.go:1196] Started watching for new ooms in manager Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.285116 4812 manager.go:319] Starting recovery of all containers Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288232 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288400 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288414 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288427 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288439 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288452 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3b6479f0-333b-4a96-9adf-2099afdc2447" volumeName="kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288465 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288478 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" 
seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288491 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288504 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288516 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288530 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288544 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288563 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288594 4812 reconstruct.go:130] "Volume 
is marked as uncertain and added into the actual state" pod="" podName="44663579-783b-4372-86d6-acf235a62d72" volumeName="kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288609 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288623 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288635 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288649 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288663 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288676 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288691 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288706 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288719 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288733 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288744 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288767 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" 
volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288781 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288794 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288806 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" volumeName="kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288819 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288833 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288864 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" 
volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288877 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288889 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288901 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288914 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288925 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288936 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" 
seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288948 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288959 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288970 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288981 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.288993 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289005 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" volumeName="kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: 
I0131 04:26:32.289016 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289028 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289042 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289054 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289065 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289078 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289090 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual 
state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289105 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289117 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289130 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289142 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289154 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289165 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" 
volumeName="kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289176 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289187 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289199 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289210 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289221 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289233 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" 
volumeName="kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289244 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289255 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289267 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289280 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289291 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" volumeName="kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289302 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" 
volumeName="kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289314 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289325 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289337 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289349 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289361 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289374 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" 
volumeName="kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289385 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289397 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289411 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289425 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289438 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289450 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" 
seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289462 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289475 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289489 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289504 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" volumeName="kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289516 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289527 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 
04:26:32.289538 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289557 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289569 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="925f1c65-6136-48ba-85aa-3a3b50560753" volumeName="kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289582 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289594 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289608 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289640 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" 
podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289654 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="496e6271-fb68-4057-954e-a0d97a4afa3f" volumeName="kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289669 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289683 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="efdd0498-1daa-4136-9a4a-3b948c2293fc" volumeName="kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289696 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289708 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" volumeName="kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289720 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289733 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="22c825df-677d-4ca6-82db-3454ed06e783" volumeName="kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289744 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289758 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289778 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289791 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289828 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" volumeName="kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" 
seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289859 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289875 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289889 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="37a5e44f-9a88-4405-be8a-b645485e7312" volumeName="kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289903 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289916 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5b88f790-22fa-440e-b583-365168c0b23d" volumeName="kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289930 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: 
I0131 04:26:32.289943 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" volumeName="kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289955 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289967 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" volumeName="kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289979 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5225d0e4-402f-4861-b410-819f433b1803" volumeName="kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.289991 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5441d097-087c-4d9a-baa8-b210afa90fc9" volumeName="kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290004 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6509e943-70c6-444c-bc41-48a544e36fbd" volumeName="kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290016 4812 reconstruct.go:130] "Volume is marked as uncertain and added 
into the actual state" pod="" podName="a31745f5-9847-4afe-82a5-3161cc66ca93" volumeName="kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290028 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" volumeName="kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290041 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290053 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290066 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290077 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290095 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290106 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="e7e6199b-1264-4501-8953-767f51328d08" volumeName="kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290119 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290132 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290145 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290158 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290171 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="d75a4c96-2883-4a0b-bab2-0fab2b6c0b49" 
volumeName="kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290184 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" volumeName="kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290196 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290212 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290227 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290239 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" volumeName="kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290252 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" 
volumeName="kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290264 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290277 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290290 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290303 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290317 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290329 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7bb08738-c794-4ee8-9972-3a62ca171029" 
volumeName="kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.290343 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.292997 4812 reconstruct.go:144] "Volume is marked device as uncertain and added into the actual state" volumeName="kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" deviceMountPath="/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293036 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293071 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293083 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6731426b-95fe-49ff-bb5f-40441049fde2" volumeName="kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293094 4812 reconstruct.go:130] "Volume is marked as uncertain and 
added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293106 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293802 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" volumeName="kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293912 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293944 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.293972 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294003 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" 
volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294025 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" volumeName="kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294046 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" volumeName="kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294067 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" volumeName="kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294090 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294114 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" volumeName="kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294141 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1386a44e-36a2-460c-96d0-0359d2b6f0f5" 
volumeName="kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294161 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294181 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6ea678ab-3438-413e-bfe3-290ae7725660" volumeName="kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294202 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7539238d-5fe0-46ed-884e-1c3b566537ec" volumeName="kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294223 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="87cf06ed-a83f-41a7-828d-70653580a8cb" volumeName="kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294242 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294261 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="fda69060-fa79-4696-b1a6-7980f124bf7c" 
volumeName="kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294280 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294302 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294320 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b78653f-4ff9-4508-8672-245ed9b561e3" volumeName="kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294341 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49ef4625-1d3a-4a9f-b595-c2433d32326d" volumeName="kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294360 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b6312bbd-5731-4ea0-a20f-81d5a57df44a" volumeName="kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294379 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="7583ce53-e0fe-4a16-9e4d-50516596a136" 
volumeName="kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294398 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="8f668bae-612b-4b75-9490-919e737c6a3b" volumeName="kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294418 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="ef543e1b-8068-4ea3-b32a-61027b32e95d" volumeName="kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294437 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" volumeName="kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294455 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="20b0d48f-5fd6-431c-a545-e3c800c7b866" volumeName="kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294473 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="31d8b7a1-420e-4252-a5b7-eebe8a111292" volumeName="kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294491 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" 
volumeName="kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294509 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294529 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" volumeName="kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294548 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294567 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="01ab3dd5-8196-46d0-ad33-122e2ca51def" volumeName="kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.294586 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="25e176fe-21b4-4974-b1ed-c8b94f112a7f" volumeName="kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297703 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" 
volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297745 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="4bb40260-dbaa-4fb0-84df-5e680505d512" volumeName="kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297767 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d4552c7-cd75-42dd-8880-30dd377c49a4" volumeName="kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297787 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="5fe579f8-e8a6-4643-bce5-a661393c4dde" volumeName="kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297806 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="9d751cbb-f2e2-430d-9754-c882a5e924a5" volumeName="kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297824 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297877 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1d611f23-29be-4491-8495-bee1670e935f" 
volumeName="kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297897 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="3ab1a177-2de0-46d9-b765-d0d0649bb42e" volumeName="kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297915 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="43509403-f426-496e-be36-56cef71462f5" volumeName="kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297935 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" volumeName="kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297952 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297970 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="bf126b07-da06-4140-9a57-dfd54fc6b486" volumeName="kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.297990 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" 
volumeName="kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298007 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="09efc573-dbb6-4249-bd59-9b87aba8dd28" volumeName="kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298025 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="0b574797-001e-440a-8f4e-c0be86edad0f" volumeName="kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298043 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298060 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" volumeName="kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298081 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298100 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="1bf7eb37-55a3-4c65-b768-a94c82151e69" volumeName="kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" 
seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298119 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="57a731c4-ef35-47a8-b875-bfb08a7f8011" volumeName="kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298137 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="6402fda4-df10-493c-b4e5-d0569419652d" volumeName="kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298155 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="c03ee662-fb2f-4fc4-a2c1-af487c19d254" volumeName="kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298176 4812 reconstruct.go:130] "Volume is marked as uncertain and added into the actual state" pod="" podName="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" volumeName="kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" seLinuxMountContext="" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298194 4812 reconstruct.go:97] "Volume reconstruction finished" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.298208 4812 reconciler.go:26] "Reconciler: start to sync state" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.304504 4812 manager.go:324] Recovery completed Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.321760 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.323437 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.323503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.323529 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.324624 4812 cpu_manager.go:225] "Starting CPU manager" policy="none" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.324662 4812 cpu_manager.go:226] "Reconciling" reconcilePeriod="10s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.324697 4812 state_mem.go:36] "Initialized new in-memory state store" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.335442 4812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv4" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.338165 4812 kubelet_network_linux.go:50] "Initialized iptables rules." protocol="IPv6" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.338247 4812 status_manager.go:217] "Starting to sync pod status with apiserver" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.338290 4812 kubelet.go:2335] "Starting kubelet main sync loop" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.338375 4812 kubelet.go:2359] "Skipping pod synchronization" err="[container runtime status check may not have completed yet, PLEG is not healthy: pleg has yet to be successful]" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.341751 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.341854 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed 
to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.350502 4812 policy_none.go:49] "None policy: Start" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.351428 4812 memory_manager.go:170] "Starting memorymanager" policy="None" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.351459 4812 state_mem.go:35] "Initializing new in-memory state store" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.372544 4812 kubelet_node_status.go:503] "Error getting the current node from lister" err="node \"crc\" not found" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.413195 4812 manager.go:334] "Starting Device Plugin manager" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.413258 4812 manager.go:513] "Failed to read data from checkpoint" checkpoint="kubelet_internal_checkpoint" err="checkpoint is not found" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.413278 4812 server.go:79] "Starting device plugin registration server" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.413782 4812 eviction_manager.go:189] "Eviction manager: starting control loop" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.413809 4812 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.414071 4812 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.414186 4812 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.414207 4812 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.425900 4812 
eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.439136 4812 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc","openshift-etcd/etcd-crc","openshift-kube-apiserver/kube-apiserver-crc","openshift-kube-controller-manager/kube-controller-manager-crc","openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.439207 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440323 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440385 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440398 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440575 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440744 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.440775 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441411 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441422 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441630 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441653 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441661 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441745 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441951 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.441992 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442396 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442423 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442461 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442563 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442666 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.442694 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.443694 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.443742 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.443757 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.443960 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444451 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444490 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444617 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444625 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444652 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444662 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.444998 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.445045 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.445062 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.445315 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.445366 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.446602 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.446664 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.446683 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.447260 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.447305 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.447321 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.477016 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="400ms" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.500938 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc 
kubenswrapper[4812]: I0131 04:26:32.501021 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501069 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501120 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501202 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501249 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501305 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501346 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501384 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501421 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501464 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501506 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501549 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501590 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.501647 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.514263 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.515685 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.515743 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 
04:26:32.515765 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.515795 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.516356 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602718 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602781 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602809 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602833 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " 
pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602904 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602932 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602959 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602979 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603000 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-cert-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602988 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603110 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603121 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-cert-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.602978 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603050 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-resource-dir\") pod \"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603158 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f614b9022728cf315e60c057852e563e-cert-dir\") pod 
\"kube-controller-manager-crc\" (UID: \"f614b9022728cf315e60c057852e563e\") " pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603140 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"data-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-data-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603119 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/3dcd261975c3d6b9a6ad6367fd4facd3-resource-dir\") pod \"openshift-kube-scheduler-crc\" (UID: \"3dcd261975c3d6b9a6ad6367fd4facd3\") " pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603215 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603251 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603276 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603286 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"static-pod-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-static-pod-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603312 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603337 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603337 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"usr-local-bin\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-usr-local-bin\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603356 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-resource-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603366 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " 
pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603411 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d1b160f5dda77d281dd8e69ec8d817f9-etc-kube\") pod \"kube-rbac-proxy-crio-crc\" (UID: \"d1b160f5dda77d281dd8e69ec8d817f9\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603412 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603448 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-dir\" (UniqueName: \"kubernetes.io/host-path/2139d3e2895fc6797b9c76a1b4c9886d-log-dir\") pod \"etcd-crc\" (UID: \"2139d3e2895fc6797b9c76a1b4c9886d\") " pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.603496 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.717146 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.718604 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.718646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:32 
crc kubenswrapper[4812]: I0131 04:26:32.718663 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.718697 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.719189 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.768955 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.788129 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.808569 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd/etcd-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.812713 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1b160f5dda77d281dd8e69ec8d817f9.slice/crio-033fcfdd5fb2cc08fb6bb9b12d263e0ca1106258efebbe46adac361870917776 WatchSource:0}: Error finding container 033fcfdd5fb2cc08fb6bb9b12d263e0ca1106258efebbe46adac361870917776: Status 404 returned error can't find the container with id 033fcfdd5fb2cc08fb6bb9b12d263e0ca1106258efebbe46adac361870917776 Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.830886 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf4b27818a5e8e43d0dc095d08835c792.slice/crio-7f6aac9fbfe14cdcd93a0c5aa909fd0507513245d202fadee01ddb52bc3fe09d WatchSource:0}: Error finding container 7f6aac9fbfe14cdcd93a0c5aa909fd0507513245d202fadee01ddb52bc3fe09d: Status 404 returned error can't find the container with id 7f6aac9fbfe14cdcd93a0c5aa909fd0507513245d202fadee01ddb52bc3fe09d Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.832805 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: I0131 04:26:32.845720 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.852558 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3dcd261975c3d6b9a6ad6367fd4facd3.slice/crio-273bb55ceac47339ac7bf14813fa3b84b0c9d77f295455b58f5d0b53b9ca5c0e WatchSource:0}: Error finding container 273bb55ceac47339ac7bf14813fa3b84b0c9d77f295455b58f5d0b53b9ca5c0e: Status 404 returned error can't find the container with id 273bb55ceac47339ac7bf14813fa3b84b0c9d77f295455b58f5d0b53b9ca5c0e Jan 31 04:26:32 crc kubenswrapper[4812]: W0131 04:26:32.873782 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf614b9022728cf315e60c057852e563e.slice/crio-a86ad09b0056a9f418c757be95fd191bea316157abee4de43b79598dbf1646c1 WatchSource:0}: Error finding container a86ad09b0056a9f418c757be95fd191bea316157abee4de43b79598dbf1646c1: Status 404 returned error can't find the container with id a86ad09b0056a9f418c757be95fd191bea316157abee4de43b79598dbf1646c1 Jan 31 04:26:32 crc kubenswrapper[4812]: E0131 04:26:32.878103 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="800ms" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.119623 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.120968 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.121006 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.121019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.121044 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.121571 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Jan 31 04:26:33 crc kubenswrapper[4812]: W0131 04:26:33.192797 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.192904 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.270443 4812 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.271480 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-10 03:53:18.957455934 +0000 UTC Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.342407 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"7f6aac9fbfe14cdcd93a0c5aa909fd0507513245d202fadee01ddb52bc3fe09d"} Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.343517 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"033fcfdd5fb2cc08fb6bb9b12d263e0ca1106258efebbe46adac361870917776"} Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.344362 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"a86ad09b0056a9f418c757be95fd191bea316157abee4de43b79598dbf1646c1"} Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.345497 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"273bb55ceac47339ac7bf14813fa3b84b0c9d77f295455b58f5d0b53b9ca5c0e"} Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.346261 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"24c3aca4985b6170c31d00830ed8cd3f123432b417c8e6249d0447e416af77c7"} Jan 31 04:26:33 crc kubenswrapper[4812]: W0131 04:26:33.450210 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.450287 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to 
list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:33 crc kubenswrapper[4812]: W0131 04:26:33.543551 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.543634 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:33 crc kubenswrapper[4812]: W0131 04:26:33.669569 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.669693 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.679222 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection 
refused" interval="1.6s" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.922072 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.924947 4812 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/default/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{crc.188fb64672b7e746 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:crc,UID:crc,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 04:26:32.268515142 +0000 UTC m=+0.763536837,LastTimestamp:2026-01-31 04:26:32.268515142 +0000 UTC m=+0.763536837,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.925170 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.925241 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.925263 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:33 crc kubenswrapper[4812]: I0131 04:26:33.925312 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:33 crc kubenswrapper[4812]: E0131 04:26:33.925988 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Jan 31 04:26:34 crc 
kubenswrapper[4812]: I0131 04:26:34.271006 4812 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.271935 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 16:36:45.373751966 +0000 UTC Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.285738 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 04:26:34 crc kubenswrapper[4812]: E0131 04:26:34.287746 4812 certificate_manager.go:562] "Unhandled Error" err="kubernetes.io/kube-apiserver-client-kubelet: Failed while requesting a signed certificate from the control plane: cannot create certificate signing request: Post \"https://api-int.crc.testing:6443/apis/certificates.k8s.io/v1/certificatesigningrequests\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.352938 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3" exitCode=0 Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.353056 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerDied","Data":"412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.353775 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.355214 4812 kubelet_node_status.go:724] "Recording event message 
for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.355281 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.355301 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.358453 4812 generic.go:334] "Generic (PLEG): container finished" podID="d1b160f5dda77d281dd8e69ec8d817f9" containerID="ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f" exitCode=0 Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.358536 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerDied","Data":"ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.358673 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.358732 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.359978 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.360026 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.360043 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.360399 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 
31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.360445 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.360462 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.362464 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.362518 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.362540 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.366415 4812 generic.go:334] "Generic (PLEG): container finished" podID="3dcd261975c3d6b9a6ad6367fd4facd3" containerID="47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e" exitCode=0 Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.366506 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerDied","Data":"47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.366635 4812 
kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.367936 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.367975 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.367991 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.370349 4812 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc" exitCode=0 Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.370422 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc"} Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.370511 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.372039 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.372101 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:34 crc kubenswrapper[4812]: I0131 04:26:34.372119 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.270804 4812 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get 
"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.272916 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 20:23:10.020033709 +0000 UTC Jan 31 04:26:35 crc kubenswrapper[4812]: E0131 04:26:35.279913 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="3.2s" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.382722 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.382860 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.384062 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.384099 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.384111 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.387831 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" 
event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.388009 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.388036 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" event={"ID":"3dcd261975c3d6b9a6ad6367fd4facd3","Type":"ContainerStarted","Data":"4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.388195 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.389680 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.389721 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.389733 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.391057 4812 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa" exitCode=0 Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.391125 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" 
event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.391336 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.392912 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.392941 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.392953 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.394546 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.394597 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.394613 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.394626 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" 
event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.396649 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" event={"ID":"d1b160f5dda77d281dd8e69ec8d817f9","Type":"ContainerStarted","Data":"bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d"} Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.396741 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.398071 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.398099 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.398110 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.526436 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.527512 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.527541 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.527549 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.527575 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:35 crc 
kubenswrapper[4812]: E0131 04:26:35.527944 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="Post \"https://api-int.crc.testing:6443/api/v1/nodes\": dial tcp 38.102.83.238:6443: connect: connection refused" node="crc" Jan 31 04:26:35 crc kubenswrapper[4812]: W0131 04:26:35.727521 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.CSIDriver: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0": dial tcp 38.102.83.238:6443: connect: connection refused Jan 31 04:26:35 crc kubenswrapper[4812]: E0131 04:26:35.727610 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.CSIDriver: failed to list *v1.CSIDriver: Get \"https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csidrivers?limit=500&resourceVersion=0\": dial tcp 38.102.83.238:6443: connect: connection refused" logger="UnhandledError" Jan 31 04:26:35 crc kubenswrapper[4812]: I0131 04:26:35.992146 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.273214 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 14:03:04.640591234 +0000 UTC Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.403511 4812 generic.go:334] "Generic (PLEG): container finished" podID="2139d3e2895fc6797b9c76a1b4c9886d" containerID="2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8" exitCode=0 Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.403615 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerDied","Data":"2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8"} Jan 31 04:26:36 crc kubenswrapper[4812]: 
I0131 04:26:36.403685 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.404966 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.405023 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.405047 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408536 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"f4b27818a5e8e43d0dc095d08835c792","Type":"ContainerStarted","Data":"ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2"} Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408570 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408640 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408660 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408735 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.408675 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410450 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410480 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410458 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410491 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410508 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410526 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410528 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410557 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410570 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:36 crc kubenswrapper[4812]: I0131 04:26:36.410595 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.273871 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 
+0000 UTC, rotation deadline is 2025-11-30 15:57:31.81253072 +0000 UTC Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417119 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443"} Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417199 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d"} Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417227 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76"} Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417237 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417322 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417252 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a"} Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.417450 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.418810 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 
04:26:37.418884 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.418905 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.419004 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.419047 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.419064 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.512519 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.512810 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.514255 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.514333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:37 crc kubenswrapper[4812]: I0131 04:26:37.514359 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.274616 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 15:29:36.115557877 +0000 UTC Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.298868 4812 
certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Rotating certificates Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.423728 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd/etcd-crc" event={"ID":"2139d3e2895fc6797b9c76a1b4c9886d","Type":"ContainerStarted","Data":"e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f"} Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.423815 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.423831 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425010 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425051 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425064 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425066 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425105 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.425122 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.570612 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 
04:26:38.729131 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.730904 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.730958 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.730976 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:38 crc kubenswrapper[4812]: I0131 04:26:38.731010 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.166461 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.166606 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.167771 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.167811 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.167823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.179422 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.204895 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-etcd/etcd-crc" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.276255 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 13:50:11.050746219 +0000 UTC Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.426956 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.427012 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.427047 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.427064 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428658 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428705 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428721 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428726 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428763 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428768 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428783 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428795 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:39 crc kubenswrapper[4812]: I0131 04:26:39.428806 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.276591 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 23:13:04.209466886 +0000 UTC Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.430147 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.430207 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.431363 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.431409 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.431415 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.431446 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.431467 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:40 
crc kubenswrapper[4812]: I0131 04:26:40.431424 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.517311 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.517537 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.518954 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.518990 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:40 crc kubenswrapper[4812]: I0131 04:26:40.519002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:41 crc kubenswrapper[4812]: I0131 04:26:41.276994 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 09:30:11.681250878 +0000 UTC Jan 31 04:26:42 crc kubenswrapper[4812]: I0131 04:26:42.277161 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 19:32:55.384355299 +0000 UTC Jan 31 04:26:42 crc kubenswrapper[4812]: E0131 04:26:42.426492 4812 eviction_manager.go:285] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"crc\" not found" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.278296 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 23:34:23.038329186 +0000 UTC Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 
04:26:43.610385 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.610668 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.612897 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.612985 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.613011 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:43 crc kubenswrapper[4812]: I0131 04:26:43.616761 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.278617 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-22 17:15:11.829497326 +0000 UTC Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.441409 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.442349 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.442557 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.442762 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:44 crc 
kubenswrapper[4812]: I0131 04:26:44.807605 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-etcd/etcd-crc" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.808246 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.810212 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.810290 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:44 crc kubenswrapper[4812]: I0131 04:26:44.810314 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:45 crc kubenswrapper[4812]: I0131 04:26:45.279327 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-19 20:50:49.227349759 +0000 UTC Jan 31 04:26:45 crc kubenswrapper[4812]: W0131 04:26:45.814115 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Node: Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 31 04:26:45 crc kubenswrapper[4812]: I0131 04:26:45.814192 4812 trace.go:236] Trace[1569079019]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 04:26:35.813) (total time: 10000ms): Jan 31 04:26:45 crc kubenswrapper[4812]: Trace[1569079019]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10000ms (04:26:45.814) Jan 31 04:26:45 crc kubenswrapper[4812]: Trace[1569079019]: [10.000974651s] [10.000974651s] END Jan 31 04:26:45 crc kubenswrapper[4812]: E0131 
04:26:45.814213 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Node: failed to list *v1.Node: Get \"https://api-int.crc.testing:6443/api/v1/nodes?fieldSelector=metadata.name%3Dcrc&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 31 04:26:46 crc kubenswrapper[4812]: W0131 04:26:46.054192 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.Service: Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.054472 4812 trace.go:236] Trace[389305195]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 04:26:36.052) (total time: 10001ms): Jan 31 04:26:46 crc kubenswrapper[4812]: Trace[389305195]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:26:46.054) Jan 31 04:26:46 crc kubenswrapper[4812]: Trace[389305195]: [10.001682545s] [10.001682545s] END Jan 31 04:26:46 crc kubenswrapper[4812]: E0131 04:26:46.054603 4812 reflector.go:158] "Unhandled Error" err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.Service: failed to list *v1.Service: Get \"https://api-int.crc.testing:6443/api/v1/services?fieldSelector=spec.clusterIP%21%3DNone&limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.271723 4812 csi_plugin.go:884] Failed to contact API server when waiting for CSINode publishing: Get "https://api-int.crc.testing:6443/apis/storage.k8s.io/v1/csinodes/crc?resourceVersion=0": net/http: TLS handshake timeout Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.279713 4812 certificate_manager.go:356] 
kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 00:51:55.219057144 +0000 UTC Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.610794 4812 patch_prober.go:28] interesting pod/kube-controller-manager-crc container/cluster-policy-controller namespace/openshift-kube-controller-manager: Startup probe status=failure output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.610945 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podUID="f614b9022728cf315e60c057852e563e" containerName="cluster-policy-controller" probeResult="failure" output="Get \"https://192.168.126.11:10357/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 04:26:46 crc kubenswrapper[4812]: W0131 04:26:46.665215 4812 reflector.go:561] k8s.io/client-go/informers/factory.go:160: failed to list *v1.RuntimeClass: Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.665328 4812 trace.go:236] Trace[1735494666]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 04:26:36.664) (total time: 10001ms): Jan 31 04:26:46 crc kubenswrapper[4812]: Trace[1735494666]: ---"Objects listed" error:Get "https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0": net/http: TLS handshake timeout 10001ms (04:26:46.665) Jan 31 04:26:46 crc kubenswrapper[4812]: Trace[1735494666]: [10.00114817s] [10.00114817s] END Jan 31 04:26:46 crc kubenswrapper[4812]: E0131 04:26:46.665363 4812 reflector.go:158] "Unhandled Error" 
err="k8s.io/client-go/informers/factory.go:160: Failed to watch *v1.RuntimeClass: failed to list *v1.RuntimeClass: Get \"https://api-int.crc.testing:6443/apis/node.k8s.io/v1/runtimeclasses?limit=500&resourceVersion=0\": net/http: TLS handshake timeout" logger="UnhandledError" Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.680248 4812 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.680324 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.685804 4812 patch_prober.go:28] interesting pod/kube-apiserver-crc container/kube-apiserver namespace/openshift-kube-apiserver: Startup probe status=failure output="HTTP probe failed with statuscode: 403" start-of-body={"kind":"Status","apiVersion":"v1","metadata":{},"status":"Failure","message":"forbidden: User \"system:anonymous\" cannot get path \"/livez\"","reason":"Forbidden","details":{},"code":403} Jan 31 04:26:46 crc kubenswrapper[4812]: I0131 04:26:46.685957 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" probeResult="failure" output="HTTP probe failed with statuscode: 403" Jan 31 04:26:47 crc kubenswrapper[4812]: I0131 04:26:47.280187 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 
05:53:03 +0000 UTC, rotation deadline is 2025-11-07 04:42:27.401409218 +0000 UTC Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.280320 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 06:42:59.601042673 +0000 UTC Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.580679 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.580953 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.582205 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.582242 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.582255 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:48 crc kubenswrapper[4812]: I0131 04:26:48.588342 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.031979 4812 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.261704 4812 apiserver.go:52] "Watching apiserver" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.269106 4812 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.269472 4812 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-network-operator/iptables-alerter-4ln5h","openshift-network-operator/network-operator-58b4c7f79c-55gtf","openshift-network-console/networking-console-plugin-85b44fc459-gdk6g","openshift-network-diagnostics/network-check-source-55646444c4-trplf","openshift-network-diagnostics/network-check-target-xd92c","openshift-network-node-identity/network-node-identity-vrzqb"] Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.269881 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.270152 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.270489 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.270383 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.270668 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:49 crc kubenswrapper[4812]: E0131 04:26:49.270778 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.270896 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:49 crc kubenswrapper[4812]: E0131 04:26:49.270896 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:49 crc kubenswrapper[4812]: E0131 04:26:49.271195 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.273713 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.273876 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.274158 4812 desired_state_of_world_populator.go:154] "Finished populating initial desired state of world" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.273922 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.273948 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.274006 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.274030 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.274551 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.274776 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.275125 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 
04:26:49.280961 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 14:29:39.270985452 +0000 UTC Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.307743 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.324835 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.343803 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.357634 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.374346 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.392792 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.406647 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.454986 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:26:49 crc kubenswrapper[4812]: I0131 04:26:49.473284 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 04:26:50 crc kubenswrapper[4812]: I0131 04:26:50.282184 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 07:44:00.673419013 +0000 UTC Jan 31 04:26:50 crc kubenswrapper[4812]: I0131 04:26:50.457735 4812 prober_manager.go:312] "Failed to trigger a 
manual run" probe="Readiness" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.282528 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-13 16:35:17.028130753 +0000 UTC Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.339280 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.339287 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.339444 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.339617 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.339750 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.339909 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.681598 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": context deadline exceeded" interval="6.4s" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.684639 4812 trace.go:236] Trace[1122361477]: "Reflector ListAndWatch" name:k8s.io/client-go/informers/factory.go:160 (31-Jan-2026 04:26:39.137) (total time: 12546ms): Jan 31 04:26:51 crc kubenswrapper[4812]: Trace[1122361477]: ---"Objects listed" error: 12546ms (04:26:51.684) Jan 31 04:26:51 crc kubenswrapper[4812]: Trace[1122361477]: [12.546599418s] [12.546599418s] END Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.684688 4812 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.686074 4812 reconstruct.go:205] "DevicePaths of reconstructed volumes updated" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.686929 4812 kubelet_node_status.go:99] "Unable to register node with API server" err="nodes \"crc\" is forbidden: autoscaling.openshift.io/ManagedNode infra config cache not synchronized" node="crc" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.699314 4812 reflector.go:368] Caches populated for 
*v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.730270 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.759699 4812 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.766750 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786545 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786592 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786618 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786642 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786666 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w4xd4\" (UniqueName: \"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786700 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786721 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786746 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786765 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") 
pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786784 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786803 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786822 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") pod \"5b88f790-22fa-440e-b583-365168c0b23d\" (UID: \"5b88f790-22fa-440e-b583-365168c0b23d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786862 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786883 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786904 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786950 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786973 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.786996 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787019 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787042 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") pod 
\"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787063 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787084 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787088 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4" (OuterVolumeSpecName: "kube-api-access-w4xd4") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "kube-api-access-w4xd4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787105 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787128 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") pod \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\" (UID: \"cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787152 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787174 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787196 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787218 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787242 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787266 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787288 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787289 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds" (OuterVolumeSpecName: "kube-api-access-w9rds") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "kube-api-access-w9rds". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787312 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787336 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787361 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787382 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787404 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787428 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787447 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787474 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787494 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787513 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787530 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787548 
4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787564 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787588 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787608 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787627 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787629 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert" (OuterVolumeSpecName: "serving-cert") pod 
"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787652 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787742 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787772 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787801 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787823 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787873 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787897 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787920 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787944 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787969 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787990 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788011 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788032 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788059 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788081 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788103 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x7zkh\" (UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788127 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788149 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788174 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788202 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788232 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788268 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") pod 
\"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788292 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788320 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788354 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788396 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788439 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788462 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788485 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788508 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788532 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") pod \"49ef4625-1d3a-4a9f-b595-c2433d32326d\" (UID: \"49ef4625-1d3a-4a9f-b595-c2433d32326d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788556 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: 
\"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788578 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788601 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788623 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787871 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788823 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp" (OuterVolumeSpecName: "kube-api-access-ngvvp") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "kube-api-access-ngvvp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788816 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788876 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb" (OuterVolumeSpecName: "kube-api-access-mg5zb") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "kube-api-access-mg5zb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788054 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788172 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca" (OuterVolumeSpecName: "service-ca") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788217 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config" (OuterVolumeSpecName: "config") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788244 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788311 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788429 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh" (OuterVolumeSpecName: "kube-api-access-x4zgh") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "kube-api-access-x4zgh". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788434 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782" (OuterVolumeSpecName: "kube-api-access-pj782") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "kube-api-access-pj782". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788543 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788614 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788627 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-router-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788631 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn" (OuterVolumeSpecName: "kube-api-access-jkwtn") pod "5b88f790-22fa-440e-b583-365168c0b23d" (UID: "5b88f790-22fa-440e-b583-365168c0b23d"). InnerVolumeSpecName "kube-api-access-jkwtn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788914 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config" (OuterVolumeSpecName: "config") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789105 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789121 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "etcd-serving-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789145 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8" (OuterVolumeSpecName: "kube-api-access-6ccd8") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "kube-api-access-6ccd8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789293 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789332 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85" (OuterVolumeSpecName: "kube-api-access-x2m85") pod "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" (UID: "cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d"). InnerVolumeSpecName "kube-api-access-x2m85". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789418 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config" (OuterVolumeSpecName: "config") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789481 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789514 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls" (OuterVolumeSpecName: "image-registry-operator-tls") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "image-registry-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.789616 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790032 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7" (OuterVolumeSpecName: "kube-api-access-kfwg7") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "kube-api-access-kfwg7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.788647 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790258 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790286 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "profile-collector-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790289 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790366 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790400 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790430 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790454 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790478 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790502 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790528 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790551 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790572 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790575 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790610 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790594 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790716 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790781 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790874 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc 
kubenswrapper[4812]: I0131 04:26:51.790973 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") pod \"925f1c65-6136-48ba-85aa-3a3b50560753\" (UID: \"925f1c65-6136-48ba-85aa-3a3b50560753\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791028 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791082 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791138 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") pod \"496e6271-fb68-4057-954e-a0d97a4afa3f\" (UID: \"496e6271-fb68-4057-954e-a0d97a4afa3f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791188 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791238 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791289 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791343 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") pod \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\" (UID: \"3ab1a177-2de0-46d9-b765-d0d0649bb42e\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791396 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791448 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") pod \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\" (UID: \"3cb93b32-e0ae-4377-b9c8-fdb9842c6d59\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791501 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 
04:26:51.791554 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") pod \"5441d097-087c-4d9a-baa8-b210afa90fc9\" (UID: \"5441d097-087c-4d9a-baa8-b210afa90fc9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791606 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791654 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791705 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") pod \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\" (UID: \"bd23aa5c-e532-4e53-bccf-e79f130c5ae8\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791749 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791796 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") pod 
\"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791878 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") pod \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\" (UID: \"b6cd30de-2eeb-49a2-ab40-9167f4560ff5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791936 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791986 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792037 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") pod \"7539238d-5fe0-46ed-884e-1c3b566537ec\" (UID: \"7539238d-5fe0-46ed-884e-1c3b566537ec\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792093 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792147 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792200 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792259 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") pod \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\" (UID: \"f88749ec-7931-4ee7-b3fc-1ec5e11f92e9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792314 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792366 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") pod \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\" (UID: \"25e176fe-21b4-4974-b1ed-c8b94f112a7f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792415 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: 
\"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792467 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") pod \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\" (UID: \"bc5039c0-ea34-426b-a2b7-fbbc87b49a6d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792521 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792573 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") pod \"0b574797-001e-440a-8f4e-c0be86edad0f\" (UID: \"0b574797-001e-440a-8f4e-c0be86edad0f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792623 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792687 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792747 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792800 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") pod \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\" (UID: \"210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792893 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792950 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793011 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") pod \"6731426b-95fe-49ff-bb5f-40441049fde2\" (UID: \"6731426b-95fe-49ff-bb5f-40441049fde2\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793070 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: 
\"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793128 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793187 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") pod \"bf126b07-da06-4140-9a57-dfd54fc6b486\" (UID: \"bf126b07-da06-4140-9a57-dfd54fc6b486\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793241 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") pod \"fda69060-fa79-4696-b1a6-7980f124bf7c\" (UID: \"fda69060-fa79-4696-b1a6-7980f124bf7c\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793294 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793345 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793400 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") pod \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\" (UID: \"8cea82b4-6893-4ddc-af9f-1bb5ae425c5b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793452 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793502 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") pod \"6402fda4-df10-493c-b4e5-d0569419652d\" (UID: \"6402fda4-df10-493c-b4e5-d0569419652d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793551 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793603 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") pod \"a31745f5-9847-4afe-82a5-3161cc66ca93\" (UID: \"a31745f5-9847-4afe-82a5-3161cc66ca93\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793659 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") pod \"7bb08738-c794-4ee8-9972-3a62ca171029\" (UID: \"7bb08738-c794-4ee8-9972-3a62ca171029\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793808 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") pod \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\" (UID: \"308be0ea-9f5f-4b29-aeb1-5abd31a0b17b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794405 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794476 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") pod \"efdd0498-1daa-4136-9a4a-3b948c2293fc\" (UID: \"efdd0498-1daa-4136-9a4a-3b948c2293fc\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794534 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") pod \"44663579-783b-4372-86d6-acf235a62d72\" (UID: \"44663579-783b-4372-86d6-acf235a62d72\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794599 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") pod \"87cf06ed-a83f-41a7-828d-70653580a8cb\" (UID: \"87cf06ed-a83f-41a7-828d-70653580a8cb\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794651 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794704 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794761 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") pod \"e7e6199b-1264-4501-8953-767f51328d08\" (UID: \"e7e6199b-1264-4501-8953-767f51328d08\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794820 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794920 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") pod \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\" (UID: \"b6312bbd-5731-4ea0-a20f-81d5a57df44a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795011 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: 
\"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795072 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") pod \"1d611f23-29be-4491-8495-bee1670e935f\" (UID: \"1d611f23-29be-4491-8495-bee1670e935f\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795125 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sb6h7\" (UniqueName: \"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795177 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") pod \"0b78653f-4ff9-4508-8672-245ed9b561e3\" (UID: \"0b78653f-4ff9-4508-8672-245ed9b561e3\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795299 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795722 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795760 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") pod \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\" (UID: \"96b93a3a-6083-4aea-8eab-fe1aa8245ad9\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795791 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") pod \"6ea678ab-3438-413e-bfe3-290ae7725660\" (UID: \"6ea678ab-3438-413e-bfe3-290ae7725660\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795823 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") pod \"20b0d48f-5fd6-431c-a545-e3c800c7b866\" (UID: \"20b0d48f-5fd6-431c-a545-e3c800c7b866\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795885 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") pod \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\" (UID: \"b11524ee-3fca-4b1b-9cdf-6da289fdbc7d\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795921 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795950 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") pod \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795982 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") pod \"4bb40260-dbaa-4fb0-84df-5e680505d512\" (UID: \"4bb40260-dbaa-4fb0-84df-5e680505d512\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796015 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796048 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796080 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796114 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") pod \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\" (UID: \"a0128f3a-b052-44ed-a84e-c4c8aaf17c13\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796147 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796181 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") pod \"9d4552c7-cd75-42dd-8880-30dd377c49a4\" (UID: \"9d4552c7-cd75-42dd-8880-30dd377c49a4\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796213 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") pod \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\" (UID: \"c03ee662-fb2f-4fc4-a2c1-af487c19d254\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796249 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") pod \"01ab3dd5-8196-46d0-ad33-122e2ca51def\" (UID: \"01ab3dd5-8196-46d0-ad33-122e2ca51def\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796282 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") pod \"7583ce53-e0fe-4a16-9e4d-50516596a136\" (UID: \"7583ce53-e0fe-4a16-9e4d-50516596a136\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796318 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") pod 
\"1386a44e-36a2-460c-96d0-0359d2b6f0f5\" (UID: \"1386a44e-36a2-460c-96d0-0359d2b6f0f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796350 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796374 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") pod \"22c825df-677d-4ca6-82db-3454ed06e783\" (UID: \"22c825df-677d-4ca6-82db-3454ed06e783\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796396 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") pod \"09efc573-dbb6-4249-bd59-9b87aba8dd28\" (UID: \"09efc573-dbb6-4249-bd59-9b87aba8dd28\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") pod \"6509e943-70c6-444c-bc41-48a544e36fbd\" (UID: \"6509e943-70c6-444c-bc41-48a544e36fbd\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796441 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") pod \"5225d0e4-402f-4861-b410-819f433b1803\" (UID: \"5225d0e4-402f-4861-b410-819f433b1803\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796464 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etcd-serving-ca\" 
(UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") pod \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\" (UID: \"09ae3b1a-e8e7-4524-b54b-61eab6f9239a\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796495 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796525 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") pod \"5fe579f8-e8a6-4643-bce5-a661393c4dde\" (UID: \"5fe579f8-e8a6-4643-bce5-a661393c4dde\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796557 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") pod \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\" (UID: \"49c341d1-5089-4bc2-86a0-a5e165cfcc6b\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796591 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") pod \"1bf7eb37-55a3-4c65-b768-a94c82151e69\" (UID: \"1bf7eb37-55a3-4c65-b768-a94c82151e69\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796622 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") pod \"31d8b7a1-420e-4252-a5b7-eebe8a111292\" (UID: \"31d8b7a1-420e-4252-a5b7-eebe8a111292\") " Jan 31 
04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796650 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") pod \"43509403-f426-496e-be36-56cef71462f5\" (UID: \"43509403-f426-496e-be36-56cef71462f5\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796681 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") pod \"57a731c4-ef35-47a8-b875-bfb08a7f8011\" (UID: \"57a731c4-ef35-47a8-b875-bfb08a7f8011\") " Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796742 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796784 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796816 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " 
pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796884 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796923 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796961 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796995 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797030 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797068 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797101 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797133 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797170 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797206 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797239 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797338 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9rds\" (UniqueName: \"kubernetes.io/projected/20b0d48f-5fd6-431c-a545-e3c800c7b866-kube-api-access-w9rds\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797363 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797382 4812 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0b78653f-4ff9-4508-8672-245ed9b561e3-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797401 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jkwtn\" (UniqueName: \"kubernetes.io/projected/5b88f790-22fa-440e-b583-365168c0b23d-kube-api-access-jkwtn\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797419 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w4xd4\" (UniqueName: 
\"kubernetes.io/projected/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-kube-api-access-w4xd4\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797435 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797455 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ngvvp\" (UniqueName: \"kubernetes.io/projected/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-kube-api-access-ngvvp\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797473 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x4zgh\" (UniqueName: \"kubernetes.io/projected/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-kube-api-access-x4zgh\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797490 4812 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797508 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ccd8\" (UniqueName: \"kubernetes.io/projected/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-kube-api-access-6ccd8\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797526 4812 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/5b88f790-22fa-440e-b583-365168c0b23d-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797543 4812 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-env-overrides\") on 
node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797561 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797579 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797596 4812 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797614 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pj782\" (UniqueName: \"kubernetes.io/projected/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-kube-api-access-pj782\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799473 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799547 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799565 4812 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 
04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799588 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2m85\" (UniqueName: \"kubernetes.io/projected/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d-kube-api-access-x2m85\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799603 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kfwg7\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-kube-api-access-kfwg7\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799622 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799638 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799654 4812 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/87cf06ed-a83f-41a7-828d-70653580a8cb-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799667 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9d4552c7-cd75-42dd-8880-30dd377c49a4-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799681 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799695 4812 reconciler_common.go:293] "Volume detached for 
volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/bf126b07-da06-4140-9a57-dfd54fc6b486-image-registry-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799709 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799723 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/1386a44e-36a2-460c-96d0-0359d2b6f0f5-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799737 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9d4552c7-cd75-42dd-8880-30dd377c49a4-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799751 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mg5zb\" (UniqueName: \"kubernetes.io/projected/6402fda4-df10-493c-b4e5-d0569419652d-kube-api-access-mg5zb\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.799765 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.800238 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.787826 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities" (OuterVolumeSpecName: "utilities") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.790720 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs" (OuterVolumeSpecName: "tmpfs") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "tmpfs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791058 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791094 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj" (OuterVolumeSpecName: "kube-api-access-4d4hj") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "kube-api-access-4d4hj". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791264 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca" (OuterVolumeSpecName: "client-ca") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791439 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8" (OuterVolumeSpecName: "kube-api-access-wxkg8") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "kube-api-access-wxkg8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791764 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.791858 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit" (OuterVolumeSpecName: "audit") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "audit". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792011 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792349 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv" (OuterVolumeSpecName: "kube-api-access-d4lsv") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "kube-api-access-d4lsv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792478 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config" (OuterVolumeSpecName: "mcd-auth-proxy-config") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "mcd-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792515 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792663 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs" (OuterVolumeSpecName: "webhook-certs") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "webhook-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792817 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792808 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs" (OuterVolumeSpecName: "metrics-certs") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "metrics-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792905 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "auth-proxy-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.792940 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert" (OuterVolumeSpecName: "ovn-control-plane-metrics-cert") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovn-control-plane-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793110 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793417 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793592 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn" (OuterVolumeSpecName: "kube-api-access-lz9wn") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "kube-api-access-lz9wn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793617 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls" (OuterVolumeSpecName: "samples-operator-tls") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "samples-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793840 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.793993 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794058 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth" (OuterVolumeSpecName: "stats-auth") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "stats-auth". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794160 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794307 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794535 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88" (OuterVolumeSpecName: "kube-api-access-lzf88") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "kube-api-access-lzf88". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794524 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794816 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh" (OuterVolumeSpecName: "kube-api-access-x7zkh") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "kube-api-access-x7zkh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.801514 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg" (OuterVolumeSpecName: "kube-api-access-dbsvg") pod "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" (UID: "f88749ec-7931-4ee7-b3fc-1ec5e11f92e9"). InnerVolumeSpecName "kube-api-access-dbsvg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794974 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist" (OuterVolumeSpecName: "cni-sysctl-allowlist") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "cni-sysctl-allowlist". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.794998 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795231 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v" (OuterVolumeSpecName: "kube-api-access-pjr6v") pod "49ef4625-1d3a-4a9f-b595-c2433d32326d" (UID: "49ef4625-1d3a-4a9f-b595-c2433d32326d"). InnerVolumeSpecName "kube-api-access-pjr6v". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795325 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert" (OuterVolumeSpecName: "srv-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "srv-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795787 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf" (OuterVolumeSpecName: "kube-api-access-v47cf") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "kube-api-access-v47cf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795892 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.795951 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.796012 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config" (OuterVolumeSpecName: "config") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.797742 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates" (OuterVolumeSpecName: "available-featuregates") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "available-featuregates". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.798010 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52" (OuterVolumeSpecName: "kube-api-access-s4n52") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "kube-api-access-s4n52". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.799532 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.800111 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.800284 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.800343 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk" (OuterVolumeSpecName: "kube-api-access-rnphk") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "kube-api-access-rnphk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.802008 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). 
InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.802413 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config" (OuterVolumeSpecName: "mcc-auth-proxy-config") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "mcc-auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.802458 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.802560 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls" (OuterVolumeSpecName: "machine-api-operator-tls") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "machine-api-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.802952 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.803340 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz" (OuterVolumeSpecName: "kube-api-access-8tdtz") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "kube-api-access-8tdtz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.803600 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:52.302394174 +0000 UTC m=+20.797415929 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.803637 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy" (OuterVolumeSpecName: "cni-binary-copy") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "cni-binary-copy". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.803955 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5" (OuterVolumeSpecName: "kube-api-access-zgdk5") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "kube-api-access-zgdk5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.803971 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6" (OuterVolumeSpecName: "kube-api-access-htfz6") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "kube-api-access-htfz6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804223 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh" (OuterVolumeSpecName: "kube-api-access-2w9zh") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "kube-api-access-2w9zh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804240 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca" (OuterVolumeSpecName: "image-import-ca") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "image-import-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804275 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config" (OuterVolumeSpecName: "config") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804411 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config" (OuterVolumeSpecName: "encryption-config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "encryption-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.805511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "0b574797-001e-440a-8f4e-c0be86edad0f" (UID: "0b574797-001e-440a-8f4e-c0be86edad0f"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804432 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp" (OuterVolumeSpecName: "kube-api-access-qs4fp") pod "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" (UID: "210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c"). InnerVolumeSpecName "kube-api-access-qs4fp". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.804737 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume" (OuterVolumeSpecName: "config-volume") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.805129 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images" (OuterVolumeSpecName: "images") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.805212 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config" (OuterVolumeSpecName: "auth-proxy-config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "auth-proxy-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.805903 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7" (OuterVolumeSpecName: "kube-api-access-9xfj7") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "kube-api-access-9xfj7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.806402 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz" (OuterVolumeSpecName: "kube-api-access-2d4wz") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "kube-api-access-2d4wz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.806690 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls" (OuterVolumeSpecName: "machine-approver-tls") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "machine-approver-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.807225 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.807574 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz" (OuterVolumeSpecName: "kube-api-access-bf2bz") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "kube-api-access-bf2bz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.807610 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh" (OuterVolumeSpecName: "kube-api-access-xcgwh") pod "fda69060-fa79-4696-b1a6-7980f124bf7c" (UID: "fda69060-fa79-4696-b1a6-7980f124bf7c"). InnerVolumeSpecName "kube-api-access-xcgwh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.807657 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808051 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7" (OuterVolumeSpecName: "kube-api-access-sb6h7") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "kube-api-access-sb6h7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808345 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808316 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808551 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "0b78653f-4ff9-4508-8672-245ed9b561e3" (UID: "0b78653f-4ff9-4508-8672-245ed9b561e3"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808500 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv" (OuterVolumeSpecName: "kube-api-access-zkvpv") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "kube-api-access-zkvpv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808800 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca" (OuterVolumeSpecName: "service-ca") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.808915 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz" (OuterVolumeSpecName: "kube-api-access-6g6sz") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "kube-api-access-6g6sz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.809548 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs" (OuterVolumeSpecName: "kube-api-access-pcxfs") pod "9d4552c7-cd75-42dd-8880-30dd377c49a4" (UID: "9d4552c7-cd75-42dd-8880-30dd377c49a4"). InnerVolumeSpecName "kube-api-access-pcxfs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.809741 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.809609 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc" (OuterVolumeSpecName: "kube-api-access-vt5rc") pod "44663579-783b-4372-86d6-acf235a62d72" (UID: "44663579-783b-4372-86d6-acf235a62d72"). InnerVolumeSpecName "kube-api-access-vt5rc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.809660 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config" (OuterVolumeSpecName: "config") pod "496e6271-fb68-4057-954e-a0d97a4afa3f" (UID: "496e6271-fb68-4057-954e-a0d97a4afa3f"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.810066 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:26:52.310041397 +0000 UTC m=+20.805063082 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810368 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7" (OuterVolumeSpecName: "kube-api-access-nzwt7") pod "96b93a3a-6083-4aea-8eab-fe1aa8245ad9" (UID: "96b93a3a-6083-4aea-8eab-fe1aa8245ad9"). InnerVolumeSpecName "kube-api-access-nzwt7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810335 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx" (OuterVolumeSpecName: "kube-api-access-d6qdx") pod "87cf06ed-a83f-41a7-828d-70653580a8cb" (UID: "87cf06ed-a83f-41a7-828d-70653580a8cb"). InnerVolumeSpecName "kube-api-access-d6qdx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810575 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810603 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-cliconfig". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810739 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m" (OuterVolumeSpecName: "kube-api-access-gf66m") pod "a0128f3a-b052-44ed-a84e-c4c8aaf17c13" (UID: "a0128f3a-b052-44ed-a84e-c4c8aaf17c13"). InnerVolumeSpecName "kube-api-access-gf66m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810751 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" (UID: "8cea82b4-6893-4ddc-af9f-1bb5ae425c5b"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810882 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810894 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810957 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.811069 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.811342 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert" (OuterVolumeSpecName: "profile-collector-cert") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "profile-collector-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.811428 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf" (OuterVolumeSpecName: "kube-api-access-7c4vf") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "kube-api-access-7c4vf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.811631 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" (UID: "308be0ea-9f5f-4b29-aeb1-5abd31a0b17b"). InnerVolumeSpecName "apiservice-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.811822 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert" (OuterVolumeSpecName: "package-server-manager-serving-cert") pod "3ab1a177-2de0-46d9-b765-d0d0649bb42e" (UID: "3ab1a177-2de0-46d9-b765-d0d0649bb42e"). InnerVolumeSpecName "package-server-manager-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.812149 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client" (OuterVolumeSpecName: "etcd-client") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-client". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.812335 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.812442 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-tls". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.812524 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt" (OuterVolumeSpecName: "kube-api-access-fqsjt") pod "efdd0498-1daa-4136-9a4a-3b948c2293fc" (UID: "efdd0498-1daa-4136-9a4a-3b948c2293fc"). InnerVolumeSpecName "kube-api-access-fqsjt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.812686 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key" (OuterVolumeSpecName: "signing-key") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-key". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.813031 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config" (OuterVolumeSpecName: "config") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.813262 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-env-overrides\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.813735 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config" (OuterVolumeSpecName: "config") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.815658 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls" (OuterVolumeSpecName: "proxy-tls") pod "31d8b7a1-420e-4252-a5b7-eebe8a111292" (UID: "31d8b7a1-420e-4252-a5b7-eebe8a111292"). InnerVolumeSpecName "proxy-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.816352 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-service-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.811820 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.810229 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr" (OuterVolumeSpecName: "kube-api-access-249nr") pod "b6312bbd-5731-4ea0-a20f-81d5a57df44a" (UID: "b6312bbd-5731-4ea0-a20f-81d5a57df44a"). InnerVolumeSpecName "kube-api-access-249nr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.817452 4812 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.818359 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.819111 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "925f1c65-6136-48ba-85aa-3a3b50560753" (UID: "925f1c65-6136-48ba-85aa-3a3b50560753"). InnerVolumeSpecName "ovnkube-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.819290 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca" (OuterVolumeSpecName: "etcd-service-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.819469 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config" (OuterVolumeSpecName: "config") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.819930 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls" (OuterVolumeSpecName: "control-plane-machine-set-operator-tls") pod "6731426b-95fe-49ff-bb5f-40441049fde2" (UID: "6731426b-95fe-49ff-bb5f-40441049fde2"). InnerVolumeSpecName "control-plane-machine-set-operator-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.819785 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config" (OuterVolumeSpecName: "config") pod "22c825df-677d-4ca6-82db-3454ed06e783" (UID: "22c825df-677d-4ca6-82db-3454ed06e783"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.820230 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-identity-cm\" (UniqueName: \"kubernetes.io/configmap/ef543e1b-8068-4ea3-b32a-61027b32e95d-ovnkube-identity-cm\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.820240 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:52.320222229 +0000 UTC m=+20.815243904 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.820365 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd" (OuterVolumeSpecName: "kube-api-access-mnrrd") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "kube-api-access-mnrrd". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.820447 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca" (OuterVolumeSpecName: "etcd-ca") pod "09efc573-dbb6-4249-bd59-9b87aba8dd28" (UID: "09efc573-dbb6-4249-bd59-9b87aba8dd28"). InnerVolumeSpecName "etcd-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.820586 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-iptables-alerter-script\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.821508 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.821516 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config" (OuterVolumeSpecName: "multus-daemon-config") pod "4bb40260-dbaa-4fb0-84df-5e680505d512" (UID: "4bb40260-dbaa-4fb0-84df-5e680505d512"). InnerVolumeSpecName "multus-daemon-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.821705 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822419 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822441 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-system-session". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822787 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate" (OuterVolumeSpecName: "default-certificate") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "default-certificate". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822862 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "49c341d1-5089-4bc2-86a0-a5e165cfcc6b" (UID: "49c341d1-5089-4bc2-86a0-a5e165cfcc6b"). InnerVolumeSpecName "v4-0-config-user-template-login". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822866 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls" (OuterVolumeSpecName: "metrics-tls") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "metrics-tls". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.822900 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities" (OuterVolumeSpecName: "utilities") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.825591 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb" (OuterVolumeSpecName: "kube-api-access-279lb") pod "7bb08738-c794-4ee8-9972-3a62ca171029" (UID: "7bb08738-c794-4ee8-9972-3a62ca171029"). InnerVolumeSpecName "kube-api-access-279lb". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.826017 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "bf126b07-da06-4140-9a57-dfd54fc6b486" (UID: "bf126b07-da06-4140-9a57-dfd54fc6b486"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.828194 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/ef543e1b-8068-4ea3-b32a-61027b32e95d-webhook-cert\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.830315 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp" (OuterVolumeSpecName: "kube-api-access-fcqwp") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "kube-api-access-fcqwp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.830754 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle" (OuterVolumeSpecName: "signing-cabundle") pod "25e176fe-21b4-4974-b1ed-c8b94f112a7f" (UID: "25e176fe-21b4-4974-b1ed-c8b94f112a7f"). InnerVolumeSpecName "signing-cabundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.833740 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs" (OuterVolumeSpecName: "certs") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.834003 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.834025 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.834037 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.834089 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:52.334072727 +0000 UTC m=+20.829094392 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.836148 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token" (OuterVolumeSpecName: "node-bootstrap-token") pod "5fe579f8-e8a6-4643-bce5-a661393c4dde" (UID: "5fe579f8-e8a6-4643-bce5-a661393c4dde"). InnerVolumeSpecName "node-bootstrap-token". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.837476 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert" (OuterVolumeSpecName: "cert") pod "20b0d48f-5fd6-431c-a545-e3c800c7b866" (UID: "20b0d48f-5fd6-431c-a545-e3c800c7b866"). InnerVolumeSpecName "cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.837724 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdwmf\" (UniqueName: \"kubernetes.io/projected/37a5e44f-9a88-4405-be8a-b645485e7312-kube-api-access-rdwmf\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.837988 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config" (OuterVolumeSpecName: "config") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.838343 4812 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.839983 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images" (OuterVolumeSpecName: "images") pod "6402fda4-df10-493c-b4e5-d0569419652d" (UID: "6402fda4-df10-493c-b4e5-d0569419652d"). InnerVolumeSpecName "images". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.840098 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct" (OuterVolumeSpecName: "kube-api-access-cfbct") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "kube-api-access-cfbct". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.840548 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.840676 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.840695 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.840706 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:51 crc kubenswrapper[4812]: E0131 04:26:51.840744 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:52.340732714 +0000 UTC m=+20.835754379 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.840735 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5" (OuterVolumeSpecName: "kube-api-access-qg5z5") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "kube-api-access-qg5z5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.841086 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config" (OuterVolumeSpecName: "console-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.841115 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2kz5\" (UniqueName: \"kubernetes.io/projected/ef543e1b-8068-4ea3-b32a-61027b32e95d-kube-api-access-s2kz5\") pod \"network-node-identity-vrzqb\" (UID: \"ef543e1b-8068-4ea3-b32a-61027b32e95d\") " pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.841101 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.841378 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/37a5e44f-9a88-4405-be8a-b645485e7312-metrics-tls\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.842640 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config" (OuterVolumeSpecName: "config") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.842919 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities" (OuterVolumeSpecName: "utilities") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.843057 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "43509403-f426-496e-be36-56cef71462f5" (UID: "43509403-f426-496e-be36-56cef71462f5"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.843460 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rczfb\" (UniqueName: \"kubernetes.io/projected/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-kube-api-access-rczfb\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.844698 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2" (OuterVolumeSpecName: "kube-api-access-jhbk2") pod "bd23aa5c-e532-4e53-bccf-e79f130c5ae8" (UID: "bd23aa5c-e532-4e53-bccf-e79f130c5ae8"). InnerVolumeSpecName "kube-api-access-jhbk2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.845757 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.846099 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e7e6199b-1264-4501-8953-767f51328d08" (UID: "e7e6199b-1264-4501-8953-767f51328d08"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.847071 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "b6cd30de-2eeb-49a2-ab40-9167f4560ff5" (UID: "b6cd30de-2eeb-49a2-ab40-9167f4560ff5"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.847470 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "c03ee662-fb2f-4fc4-a2c1-af487c19d254" (UID: "c03ee662-fb2f-4fc4-a2c1-af487c19d254"). InnerVolumeSpecName "service-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.847650 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle" (OuterVolumeSpecName: "service-ca-bundle") pod "6509e943-70c6-444c-bc41-48a544e36fbd" (UID: "6509e943-70c6-444c-bc41-48a544e36fbd"). InnerVolumeSpecName "service-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.848272 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca" (OuterVolumeSpecName: "client-ca") pod "5441d097-087c-4d9a-baa8-b210afa90fc9" (UID: "5441d097-087c-4d9a-baa8-b210afa90fc9"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.848582 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca" (OuterVolumeSpecName: "serviceca") pod "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" (UID: "3cb93b32-e0ae-4377-b9c8-fdb9842c6d59"). InnerVolumeSpecName "serviceca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.848947 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config" (OuterVolumeSpecName: "config") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.849037 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config" (OuterVolumeSpecName: "config") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.851815 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "1d611f23-29be-4491-8495-bee1670e935f" (UID: "1d611f23-29be-4491-8495-bee1670e935f"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.852531 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "env-overrides". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.853168 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.854426 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl" (OuterVolumeSpecName: "kube-api-access-xcphl") pod "7583ce53-e0fe-4a16-9e4d-50516596a136" (UID: "7583ce53-e0fe-4a16-9e4d-50516596a136"). InnerVolumeSpecName "kube-api-access-xcphl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.854741 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "6ea678ab-3438-413e-bfe3-290ae7725660" (UID: "6ea678ab-3438-413e-bfe3-290ae7725660"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.855069 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a31745f5-9847-4afe-82a5-3161cc66ca93" (UID: "a31745f5-9847-4afe-82a5-3161cc66ca93"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.855083 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j" (OuterVolumeSpecName: "kube-api-access-w7l8j") pod "01ab3dd5-8196-46d0-ad33-122e2ca51def" (UID: "01ab3dd5-8196-46d0-ad33-122e2ca51def"). InnerVolumeSpecName "kube-api-access-w7l8j". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.855243 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities" (OuterVolumeSpecName: "utilities") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.855388 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "1386a44e-36a2-460c-96d0-0359d2b6f0f5" (UID: "1386a44e-36a2-460c-96d0-0359d2b6f0f5"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.855564 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca" (OuterVolumeSpecName: "etcd-serving-ca") pod "09ae3b1a-e8e7-4524-b54b-61eab6f9239a" (UID: "09ae3b1a-e8e7-4524-b54b-61eab6f9239a"). InnerVolumeSpecName "etcd-serving-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.858737 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" (UID: "b11524ee-3fca-4b1b-9cdf-6da289fdbc7d"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.858994 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.859952 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c" (OuterVolumeSpecName: "kube-api-access-tk88c") pod "7539238d-5fe0-46ed-884e-1c3b566537ec" (UID: "7539238d-5fe0-46ed-884e-1c3b566537ec"). InnerVolumeSpecName "kube-api-access-tk88c". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.861095 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" (UID: "bc5039c0-ea34-426b-a2b7-fbbc87b49a6d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.862006 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config" (OuterVolumeSpecName: "config") pod "1bf7eb37-55a3-4c65-b768-a94c82151e69" (UID: "1bf7eb37-55a3-4c65-b768-a94c82151e69"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.864587 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook 
approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.871781 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "57a731c4-ef35-47a8-b875-bfb08a7f8011" (UID: "57a731c4-ef35-47a8-b875-bfb08a7f8011"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.876162 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.885897 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.889603 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "5225d0e4-402f-4861-b410-819f433b1803" (UID: "5225d0e4-402f-4861-b410-819f433b1803"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.900561 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.900598 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.900716 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-etc-kube\" (UniqueName: \"kubernetes.io/host-path/37a5e44f-9a88-4405-be8a-b645485e7312-host-etc-kube\") pod \"network-operator-58b4c7f79c-55gtf\" (UID: \"37a5e44f-9a88-4405-be8a-b645485e7312\") " pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901065 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d75a4c96-2883-4a0b-bab2-0fab2b6c0b49-host-slash\") pod \"iptables-alerter-4ln5h\" (UID: \"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\") " pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901097 4812 reconciler_common.go:293] "Volume detached for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-default-certificate\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901111 4812 
reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901123 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcgwh\" (UniqueName: \"kubernetes.io/projected/fda69060-fa79-4696-b1a6-7980f124bf7c-kube-api-access-xcgwh\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901136 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-249nr\" (UniqueName: \"kubernetes.io/projected/b6312bbd-5731-4ea0-a20f-81d5a57df44a-kube-api-access-249nr\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901150 4812 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901162 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901173 4812 reconciler_common.go:293] "Volume detached for volume \"certs\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901184 4812 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: \"kubernetes.io/configmap/6402fda4-df10-493c-b4e5-d0569419652d-images\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901196 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: 
\"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901209 4812 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/a31745f5-9847-4afe-82a5-3161cc66ca93-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901220 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-279lb\" (UniqueName: \"kubernetes.io/projected/7bb08738-c794-4ee8-9972-3a62ca171029-kube-api-access-279lb\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901290 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901330 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901346 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fqsjt\" (UniqueName: \"kubernetes.io/projected/efdd0498-1daa-4136-9a4a-3b948c2293fc-kube-api-access-fqsjt\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901361 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vt5rc\" (UniqueName: \"kubernetes.io/projected/44663579-783b-4372-86d6-acf235a62d72-kube-api-access-vt5rc\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901373 4812 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/configmap/87cf06ed-a83f-41a7-828d-70653580a8cb-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901385 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901396 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/e7e6199b-1264-4501-8953-767f51328d08-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901408 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901420 4812 reconciler_common.go:293] "Volume detached for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-profile-collector-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901434 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fcqwp\" (UniqueName: \"kubernetes.io/projected/5fe579f8-e8a6-4643-bce5-a661393c4dde-kube-api-access-fcqwp\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901446 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bf2bz\" (UniqueName: \"kubernetes.io/projected/1d611f23-29be-4491-8495-bee1670e935f-kube-api-access-bf2bz\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901458 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sb6h7\" (UniqueName: 
\"kubernetes.io/projected/1bf7eb37-55a3-4c65-b768-a94c82151e69-kube-api-access-sb6h7\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901469 4812 reconciler_common.go:293] "Volume detached for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-encryption-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901481 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/0b78653f-4ff9-4508-8672-245ed9b561e3-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901493 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7c4vf\" (UniqueName: \"kubernetes.io/projected/22c825df-677d-4ca6-82db-3454ed06e783-kube-api-access-7c4vf\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901504 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nzwt7\" (UniqueName: \"kubernetes.io/projected/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-kube-api-access-nzwt7\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901516 4812 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901528 4812 reconciler_common.go:293] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/20b0d48f-5fd6-431c-a545-e3c800c7b866-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901540 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-utilities\") on node \"crc\" DevicePath 
\"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901551 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901562 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1386a44e-36a2-460c-96d0-0359d2b6f0f5-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901575 4812 reconciler_common.go:293] "Volume detached for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-multus-daemon-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901587 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qg5z5\" (UniqueName: \"kubernetes.io/projected/43509403-f426-496e-be36-56cef71462f5-kube-api-access-qg5z5\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901599 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1bf7eb37-55a3-4c65-b768-a94c82151e69-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901610 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901622 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gf66m\" (UniqueName: \"kubernetes.io/projected/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-kube-api-access-gf66m\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901633 4812 reconciler_common.go:293] 
"Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901645 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pcxfs\" (UniqueName: \"kubernetes.io/projected/9d4552c7-cd75-42dd-8880-30dd377c49a4-kube-api-access-pcxfs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901659 4812 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c03ee662-fb2f-4fc4-a2c1-af487c19d254-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901671 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w7l8j\" (UniqueName: \"kubernetes.io/projected/01ab3dd5-8196-46d0-ad33-122e2ca51def-kube-api-access-w7l8j\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901682 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xcphl\" (UniqueName: \"kubernetes.io/projected/7583ce53-e0fe-4a16-9e4d-50516596a136-kube-api-access-xcphl\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901693 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/1386a44e-36a2-460c-96d0-0359d2b6f0f5-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901705 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901716 4812 reconciler_common.go:293] "Volume detached for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901727 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-client\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901739 4812 reconciler_common.go:293] "Volume detached for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-service-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901750 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/5225d0e4-402f-4861-b410-819f433b1803-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901762 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-etcd-serving-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901773 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901785 4812 reconciler_common.go:293] "Volume detached for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/5fe579f8-e8a6-4643-bce5-a661393c4dde-node-bootstrap-token\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901797 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901810 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901823 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/31d8b7a1-420e-4252-a5b7-eebe8a111292-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901855 4812 reconciler_common.go:293] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-oauth-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901868 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cfbct\" (UniqueName: \"kubernetes.io/projected/57a731c4-ef35-47a8-b875-bfb08a7f8011-kube-api-access-cfbct\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901880 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901892 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lzf88\" (UniqueName: \"kubernetes.io/projected/0b574797-001e-440a-8f4e-c0be86edad0f-kube-api-access-lzf88\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901903 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-trusted-ca-bundle\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901915 4812 reconciler_common.go:293] "Volume detached for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-audit\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901926 4812 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8f668bae-612b-4b75-9490-919e737c6a3b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901937 4812 reconciler_common.go:293] "Volume detached for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b-tmpfs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901949 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wxkg8\" (UniqueName: \"kubernetes.io/projected/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-kube-api-access-wxkg8\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901963 4812 reconciler_common.go:293] "Volume detached for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/fda69060-fa79-4696-b1a6-7980f124bf7c-mcd-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901975 4812 reconciler_common.go:293] "Volume detached for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/925f1c65-6136-48ba-85aa-3a3b50560753-ovn-control-plane-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901986 4812 reconciler_common.go:293] "Volume detached for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/a0128f3a-b052-44ed-a84e-c4c8aaf17c13-samples-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.901998 4812 
reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1d611f23-29be-4491-8495-bee1670e935f-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902011 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902024 4812 reconciler_common.go:293] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902036 4812 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/925f1c65-6136-48ba-85aa-3a3b50560753-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902047 4812 reconciler_common.go:293] "Volume detached for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-etcd-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902059 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902070 4812 reconciler_common.go:293] "Volume detached for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/b6312bbd-5731-4ea0-a20f-81d5a57df44a-srv-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902081 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/0b78653f-4ff9-4508-8672-245ed9b561e3-serving-cert\") 
on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902092 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6g6sz\" (UniqueName: \"kubernetes.io/projected/6509e943-70c6-444c-bc41-48a544e36fbd-kube-api-access-6g6sz\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902104 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/496e6271-fb68-4057-954e-a0d97a4afa3f-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902115 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902126 4812 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902137 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902149 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnphk\" (UniqueName: \"kubernetes.io/projected/bf126b07-da06-4140-9a57-dfd54fc6b486-kube-api-access-rnphk\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902161 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v47cf\" (UniqueName: \"kubernetes.io/projected/c03ee662-fb2f-4fc4-a2c1-af487c19d254-kube-api-access-v47cf\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 
04:26:51.902173 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/496e6271-fb68-4057-954e-a0d97a4afa3f-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902184 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2w9zh\" (UniqueName: \"kubernetes.io/projected/4bb40260-dbaa-4fb0-84df-5e680505d512-kube-api-access-2w9zh\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902196 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902207 4812 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/6ea678ab-3438-413e-bfe3-290ae7725660-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902220 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902231 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902243 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/7583ce53-e0fe-4a16-9e4d-50516596a136-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902255 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x7zkh\" 
(UniqueName: \"kubernetes.io/projected/6731426b-95fe-49ff-bb5f-40441049fde2-kube-api-access-x7zkh\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902273 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902285 4812 reconciler_common.go:293] "Volume detached for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/6402fda4-df10-493c-b4e5-d0569419652d-machine-api-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902296 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/efdd0498-1daa-4136-9a4a-3b948c2293fc-webhook-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902318 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/5441d097-087c-4d9a-baa8-b210afa90fc9-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902329 4812 reconciler_common.go:293] "Volume detached for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/96b93a3a-6083-4aea-8eab-fe1aa8245ad9-metrics-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902340 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lz9wn\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-kube-api-access-lz9wn\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902352 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: 
\"kubernetes.io/secret/fda69060-fa79-4696-b1a6-7980f124bf7c-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902363 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bf126b07-da06-4140-9a57-dfd54fc6b486-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902375 4812 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902386 4812 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902398 4812 reconciler_common.go:293] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-oauth-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902409 4812 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902421 4812 reconciler_common.go:293] "Volume detached for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-metrics-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902432 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09efc573-dbb6-4249-bd59-9b87aba8dd28-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 
31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902443 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d4lsv\" (UniqueName: \"kubernetes.io/projected/25e176fe-21b4-4974-b1ed-c8b94f112a7f-kube-api-access-d4lsv\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902455 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pjr6v\" (UniqueName: \"kubernetes.io/projected/49ef4625-1d3a-4a9f-b595-c2433d32326d-kube-api-access-pjr6v\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902466 4812 reconciler_common.go:293] "Volume detached for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/4bb40260-dbaa-4fb0-84df-5e680505d512-cni-binary-copy\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902478 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zgdk5\" (UniqueName: \"kubernetes.io/projected/31d8b7a1-420e-4252-a5b7-eebe8a111292-kube-api-access-zgdk5\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902489 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2d4wz\" (UniqueName: \"kubernetes.io/projected/5441d097-087c-4d9a-baa8-b210afa90fc9-kube-api-access-2d4wz\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902501 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a31745f5-9847-4afe-82a5-3161cc66ca93-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902512 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09efc573-dbb6-4249-bd59-9b87aba8dd28-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902523 4812 reconciler_common.go:293] 
"Volume detached for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902534 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9xfj7\" (UniqueName: \"kubernetes.io/projected/5225d0e4-402f-4861-b410-819f433b1803-kube-api-access-9xfj7\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902546 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8tdtz\" (UniqueName: \"kubernetes.io/projected/09efc573-dbb6-4249-bd59-9b87aba8dd28-kube-api-access-8tdtz\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902558 4812 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902568 4812 reconciler_common.go:293] "Volume detached for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/1bf7eb37-55a3-4c65-b768-a94c82151e69-image-import-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902580 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4d4hj\" (UniqueName: \"kubernetes.io/projected/3ab1a177-2de0-46d9-b765-d0d0649bb42e-kube-api-access-4d4hj\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902592 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/0b574797-001e-440a-8f4e-c0be86edad0f-proxy-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902603 4812 reconciler_common.go:293] "Volume detached for volume \"images\" (UniqueName: 
\"kubernetes.io/configmap/31d8b7a1-420e-4252-a5b7-eebe8a111292-images\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902614 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6509e943-70c6-444c-bc41-48a544e36fbd-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902625 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6509e943-70c6-444c-bc41-48a544e36fbd-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902637 4812 reconciler_common.go:293] "Volume detached for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-available-featuregates\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902651 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902662 4812 reconciler_common.go:293] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/43509403-f426-496e-be36-56cef71462f5-console-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902673 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s4n52\" (UniqueName: \"kubernetes.io/projected/925f1c65-6136-48ba-85aa-3a3b50560753-kube-api-access-s4n52\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902685 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/49c341d1-5089-4bc2-86a0-a5e165cfcc6b-v4-0-config-system-ocp-branding-template\") 
on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902697 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-htfz6\" (UniqueName: \"kubernetes.io/projected/6ea678ab-3438-413e-bfe3-290ae7725660-kube-api-access-htfz6\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902709 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/496e6271-fb68-4057-954e-a0d97a4afa3f-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902720 4812 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902732 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e7e6199b-1264-4501-8953-767f51328d08-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902743 4812 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/6ea678ab-3438-413e-bfe3-290ae7725660-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902755 4812 reconciler_common.go:293] "Volume detached for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/3ab1a177-2de0-46d9-b765-d0d0649bb42e-package-server-manager-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902766 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/01ab3dd5-8196-46d0-ad33-122e2ca51def-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902778 4812 
reconciler_common.go:293] "Volume detached for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59-serviceca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902789 4812 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a31745f5-9847-4afe-82a5-3161cc66ca93-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902800 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/5441d097-087c-4d9a-baa8-b210afa90fc9-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902812 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tk88c\" (UniqueName: \"kubernetes.io/projected/7539238d-5fe0-46ed-884e-1c3b566537ec-kube-api-access-tk88c\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902823 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7539238d-5fe0-46ed-884e-1c3b566537ec-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902835 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jhbk2\" (UniqueName: \"kubernetes.io/projected/bd23aa5c-e532-4e53-bccf-e79f130c5ae8-kube-api-access-jhbk2\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902865 4812 reconciler_common.go:293] "Volume detached for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-key\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902875 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/e7e6199b-1264-4501-8953-767f51328d08-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902904 4812 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/b6cd30de-2eeb-49a2-ab40-9167f4560ff5-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902916 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/57a731c4-ef35-47a8-b875-bfb08a7f8011-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902926 4812 reconciler_common.go:293] "Volume detached for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/c03ee662-fb2f-4fc4-a2c1-af487c19d254-stats-auth\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902938 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7539238d-5fe0-46ed-884e-1c3b566537ec-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902949 4812 reconciler_common.go:293] "Volume detached for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/7bb08738-c794-4ee8-9972-3a62ca171029-cni-sysctl-allowlist\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902961 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902972 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zkvpv\" (UniqueName: \"kubernetes.io/projected/09ae3b1a-e8e7-4524-b54b-61eab6f9239a-kube-api-access-zkvpv\") on node \"crc\" DevicePath \"\"" Jan 31 
04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902984 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dbsvg\" (UniqueName: \"kubernetes.io/projected/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9-kube-api-access-dbsvg\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.902996 4812 reconciler_common.go:293] "Volume detached for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/22c825df-677d-4ca6-82db-3454ed06e783-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903007 4812 reconciler_common.go:293] "Volume detached for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/25e176fe-21b4-4974-b1ed-c8b94f112a7f-signing-cabundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903018 4812 reconciler_common.go:293] "Volume detached for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/22c825df-677d-4ca6-82db-3454ed06e783-machine-approver-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903030 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mnrrd\" (UniqueName: \"kubernetes.io/projected/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d-kube-api-access-mnrrd\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903041 4812 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8f668bae-612b-4b75-9490-919e737c6a3b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903055 4812 reconciler_common.go:293] "Volume detached for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/0b574797-001e-440a-8f4e-c0be86edad0f-mcc-auth-proxy-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903066 4812 
reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8f668bae-612b-4b75-9490-919e737c6a3b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903077 4812 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8f668bae-612b-4b75-9490-919e737c6a3b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903088 4812 reconciler_common.go:293] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/43509403-f426-496e-be36-56cef71462f5-console-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903100 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4fp\" (UniqueName: \"kubernetes.io/projected/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c-kube-api-access-qs4fp\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903111 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7583ce53-e0fe-4a16-9e4d-50516596a136-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903122 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/01ab3dd5-8196-46d0-ad33-122e2ca51def-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903134 4812 reconciler_common.go:293] "Volume detached for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/6731426b-95fe-49ff-bb5f-40441049fde2-control-plane-machine-set-operator-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.903146 4812 reconciler_common.go:293] "Volume detached for volume 
\"kube-api-access-d6qdx\" (UniqueName: \"kubernetes.io/projected/87cf06ed-a83f-41a7-828d-70653580a8cb-kube-api-access-d6qdx\") on node \"crc\" DevicePath \"\"" Jan 31 04:26:51 crc kubenswrapper[4812]: I0131 04:26:51.991717 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.004295 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-node-identity/network-node-identity-vrzqb" Jan 31 04:26:52 crc kubenswrapper[4812]: W0131 04:26:52.004882 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod37a5e44f_9a88_4405_be8a_b645485e7312.slice/crio-31dea8a0dd39a92dadbf2986428fb8e0fa75109d5e858011aacfb80922568f51 WatchSource:0}: Error finding container 31dea8a0dd39a92dadbf2986428fb8e0fa75109d5e858011aacfb80922568f51: Status 404 returned error can't find the container with id 31dea8a0dd39a92dadbf2986428fb8e0fa75109d5e858011aacfb80922568f51 Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.016398 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-4ln5h" Jan 31 04:26:52 crc kubenswrapper[4812]: W0131 04:26:52.021124 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef543e1b_8068_4ea3_b32a_61027b32e95d.slice/crio-8a33a3a30c908db372bca1b3e7bc60f581489d09d8936eacf9d674d28bb8a460 WatchSource:0}: Error finding container 8a33a3a30c908db372bca1b3e7bc60f581489d09d8936eacf9d674d28bb8a460: Status 404 returned error can't find the container with id 8a33a3a30c908db372bca1b3e7bc60f581489d09d8936eacf9d674d28bb8a460 Jan 31 04:26:52 crc kubenswrapper[4812]: W0131 04:26:52.033185 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd75a4c96_2883_4a0b_bab2_0fab2b6c0b49.slice/crio-bcc39c0b4301a2fef7c2ef41c423ea0e446b4344979ba6d3b4ac0d70af13aef5 WatchSource:0}: Error finding container bcc39c0b4301a2fef7c2ef41c423ea0e446b4344979ba6d3b4ac0d70af13aef5: Status 404 returned error can't find the container with id bcc39c0b4301a2fef7c2ef41c423ea0e446b4344979ba6d3b4ac0d70af13aef5 Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.283633 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-25 02:00:55.113631134 +0000 UTC Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.306624 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.306721 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object 
"openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.306774 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:53.306760246 +0000 UTC m=+21.801781921 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.345899 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01ab3dd5-8196-46d0-ad33-122e2ca51def" path="/var/lib/kubelet/pods/01ab3dd5-8196-46d0-ad33-122e2ca51def/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.346737 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09ae3b1a-e8e7-4524-b54b-61eab6f9239a" path="/var/lib/kubelet/pods/09ae3b1a-e8e7-4524-b54b-61eab6f9239a/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.348014 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="09efc573-dbb6-4249-bd59-9b87aba8dd28" path="/var/lib/kubelet/pods/09efc573-dbb6-4249-bd59-9b87aba8dd28/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.348986 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b574797-001e-440a-8f4e-c0be86edad0f" path="/var/lib/kubelet/pods/0b574797-001e-440a-8f4e-c0be86edad0f/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.349786 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b78653f-4ff9-4508-8672-245ed9b561e3" 
path="/var/lib/kubelet/pods/0b78653f-4ff9-4508-8672-245ed9b561e3/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.350492 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1386a44e-36a2-460c-96d0-0359d2b6f0f5" path="/var/lib/kubelet/pods/1386a44e-36a2-460c-96d0-0359d2b6f0f5/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.351333 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1bf7eb37-55a3-4c65-b768-a94c82151e69" path="/var/lib/kubelet/pods/1bf7eb37-55a3-4c65-b768-a94c82151e69/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.352085 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d611f23-29be-4491-8495-bee1670e935f" path="/var/lib/kubelet/pods/1d611f23-29be-4491-8495-bee1670e935f/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.353011 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="20b0d48f-5fd6-431c-a545-e3c800c7b866" path="/var/lib/kubelet/pods/20b0d48f-5fd6-431c-a545-e3c800c7b866/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.353707 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c" path="/var/lib/kubelet/pods/210d8245-ebfc-4e3b-ac4a-e21ce76f9a7c/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.354287 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.354404 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22c825df-677d-4ca6-82db-3454ed06e783" path="/var/lib/kubelet/pods/22c825df-677d-4ca6-82db-3454ed06e783/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.356464 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25e176fe-21b4-4974-b1ed-c8b94f112a7f" path="/var/lib/kubelet/pods/25e176fe-21b4-4974-b1ed-c8b94f112a7f/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.357731 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="308be0ea-9f5f-4b29-aeb1-5abd31a0b17b" path="/var/lib/kubelet/pods/308be0ea-9f5f-4b29-aeb1-5abd31a0b17b/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.359789 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="31d8b7a1-420e-4252-a5b7-eebe8a111292" path="/var/lib/kubelet/pods/31d8b7a1-420e-4252-a5b7-eebe8a111292/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.361413 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="3ab1a177-2de0-46d9-b765-d0d0649bb42e" path="/var/lib/kubelet/pods/3ab1a177-2de0-46d9-b765-d0d0649bb42e/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.362890 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3cb93b32-e0ae-4377-b9c8-fdb9842c6d59" path="/var/lib/kubelet/pods/3cb93b32-e0ae-4377-b9c8-fdb9842c6d59/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.364677 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="43509403-f426-496e-be36-56cef71462f5" path="/var/lib/kubelet/pods/43509403-f426-496e-be36-56cef71462f5/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.365767 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="44663579-783b-4372-86d6-acf235a62d72" path="/var/lib/kubelet/pods/44663579-783b-4372-86d6-acf235a62d72/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.367385 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="496e6271-fb68-4057-954e-a0d97a4afa3f" path="/var/lib/kubelet/pods/496e6271-fb68-4057-954e-a0d97a4afa3f/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.368915 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49c341d1-5089-4bc2-86a0-a5e165cfcc6b" path="/var/lib/kubelet/pods/49c341d1-5089-4bc2-86a0-a5e165cfcc6b/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.371240 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="49ef4625-1d3a-4a9f-b595-c2433d32326d" path="/var/lib/kubelet/pods/49ef4625-1d3a-4a9f-b595-c2433d32326d/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.372478 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4bb40260-dbaa-4fb0-84df-5e680505d512" path="/var/lib/kubelet/pods/4bb40260-dbaa-4fb0-84df-5e680505d512/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.372716 4812 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.373478 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5225d0e4-402f-4861-b410-819f433b1803" path="/var/lib/kubelet/pods/5225d0e4-402f-4861-b410-819f433b1803/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.375740 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5441d097-087c-4d9a-baa8-b210afa90fc9" path="/var/lib/kubelet/pods/5441d097-087c-4d9a-baa8-b210afa90fc9/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.376780 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57a731c4-ef35-47a8-b875-bfb08a7f8011" path="/var/lib/kubelet/pods/57a731c4-ef35-47a8-b875-bfb08a7f8011/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.378399 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5b88f790-22fa-440e-b583-365168c0b23d" path="/var/lib/kubelet/pods/5b88f790-22fa-440e-b583-365168c0b23d/volumes" Jan 31 04:26:52 crc 
kubenswrapper[4812]: I0131 04:26:52.380013 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5fe579f8-e8a6-4643-bce5-a661393c4dde" path="/var/lib/kubelet/pods/5fe579f8-e8a6-4643-bce5-a661393c4dde/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.381263 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6402fda4-df10-493c-b4e5-d0569419652d" path="/var/lib/kubelet/pods/6402fda4-df10-493c-b4e5-d0569419652d/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.382761 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6509e943-70c6-444c-bc41-48a544e36fbd" path="/var/lib/kubelet/pods/6509e943-70c6-444c-bc41-48a544e36fbd/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.384020 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6731426b-95fe-49ff-bb5f-40441049fde2" path="/var/lib/kubelet/pods/6731426b-95fe-49ff-bb5f-40441049fde2/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.386147 4812 kubelet_volumes.go:152] "Cleaned up orphaned volume subpath from pod" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volume-subpaths/run-systemd/ovnkube-controller/6" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.386347 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ea678ab-3438-413e-bfe3-290ae7725660" path="/var/lib/kubelet/pods/6ea678ab-3438-413e-bfe3-290ae7725660/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.388340 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7539238d-5fe0-46ed-884e-1c3b566537ec" path="/var/lib/kubelet/pods/7539238d-5fe0-46ed-884e-1c3b566537ec/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.388321 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.389187 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7583ce53-e0fe-4a16-9e4d-50516596a136" path="/var/lib/kubelet/pods/7583ce53-e0fe-4a16-9e4d-50516596a136/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.389705 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb08738-c794-4ee8-9972-3a62ca171029" path="/var/lib/kubelet/pods/7bb08738-c794-4ee8-9972-3a62ca171029/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.391137 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="87cf06ed-a83f-41a7-828d-70653580a8cb" path="/var/lib/kubelet/pods/87cf06ed-a83f-41a7-828d-70653580a8cb/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.391943 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8cea82b4-6893-4ddc-af9f-1bb5ae425c5b" path="/var/lib/kubelet/pods/8cea82b4-6893-4ddc-af9f-1bb5ae425c5b/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.392633 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="925f1c65-6136-48ba-85aa-3a3b50560753" path="/var/lib/kubelet/pods/925f1c65-6136-48ba-85aa-3a3b50560753/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.393495 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96b93a3a-6083-4aea-8eab-fe1aa8245ad9" path="/var/lib/kubelet/pods/96b93a3a-6083-4aea-8eab-fe1aa8245ad9/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.394349 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9d4552c7-cd75-42dd-8880-30dd377c49a4" path="/var/lib/kubelet/pods/9d4552c7-cd75-42dd-8880-30dd377c49a4/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.394994 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a0128f3a-b052-44ed-a84e-c4c8aaf17c13" path="/var/lib/kubelet/pods/a0128f3a-b052-44ed-a84e-c4c8aaf17c13/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.395729 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a31745f5-9847-4afe-82a5-3161cc66ca93" path="/var/lib/kubelet/pods/a31745f5-9847-4afe-82a5-3161cc66ca93/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.396507 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b11524ee-3fca-4b1b-9cdf-6da289fdbc7d" path="/var/lib/kubelet/pods/b11524ee-3fca-4b1b-9cdf-6da289fdbc7d/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.398226 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6312bbd-5731-4ea0-a20f-81d5a57df44a" path="/var/lib/kubelet/pods/b6312bbd-5731-4ea0-a20f-81d5a57df44a/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.399245 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b6cd30de-2eeb-49a2-ab40-9167f4560ff5" path="/var/lib/kubelet/pods/b6cd30de-2eeb-49a2-ab40-9167f4560ff5/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.401134 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bc5039c0-ea34-426b-a2b7-fbbc87b49a6d" path="/var/lib/kubelet/pods/bc5039c0-ea34-426b-a2b7-fbbc87b49a6d/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.402275 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bd23aa5c-e532-4e53-bccf-e79f130c5ae8" path="/var/lib/kubelet/pods/bd23aa5c-e532-4e53-bccf-e79f130c5ae8/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.404549 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf126b07-da06-4140-9a57-dfd54fc6b486" path="/var/lib/kubelet/pods/bf126b07-da06-4140-9a57-dfd54fc6b486/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.404515 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-de
v/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMou
nts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d3472
0243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.405693 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c03ee662-fb2f-4fc4-a2c1-af487c19d254" path="/var/lib/kubelet/pods/c03ee662-fb2f-4fc4-a2c1-af487c19d254/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.406946 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d" path="/var/lib/kubelet/pods/cd70aa09-68dd-4d64-bd6f-156fe6d1dc6d/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.407293 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.407387 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.407492 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407525 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:26:53.407493186 +0000 UTC m=+21.902514881 (durationBeforeRetry 1s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.407574 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407654 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407697 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407723 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:53.407702112 +0000 UTC m=+21.902723817 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407737 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407767 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407790 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407812 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407832 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407886 4812 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:53.407822565 +0000 UTC m=+21.902844300 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:52 crc kubenswrapper[4812]: E0131 04:26:52.407917 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:53.407902007 +0000 UTC m=+21.902923712 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.408759 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7e6199b-1264-4501-8953-767f51328d08" path="/var/lib/kubelet/pods/e7e6199b-1264-4501-8953-767f51328d08/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.410026 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efdd0498-1daa-4136-9a4a-3b948c2293fc" path="/var/lib/kubelet/pods/efdd0498-1daa-4136-9a4a-3b948c2293fc/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.411356 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f88749ec-7931-4ee7-b3fc-1ec5e11f92e9" path="/var/lib/kubelet/pods/f88749ec-7931-4ee7-b3fc-1ec5e11f92e9/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.413410 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fda69060-fa79-4696-b1a6-7980f124bf7c" path="/var/lib/kubelet/pods/fda69060-fa79-4696-b1a6-7980f124bf7c/volumes" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.422268 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [webhook approver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":false,\\\"restartCount\\\":6,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: 
connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.435432 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.451341 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.466230 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"bcc39c0b4301a2fef7c2ef41c423ea0e446b4344979ba6d3b4ac0d70af13aef5"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.468511 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.468573 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.468592 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" event={"ID":"ef543e1b-8068-4ea3-b32a-61027b32e95d","Type":"ContainerStarted","Data":"8a33a3a30c908db372bca1b3e7bc60f581489d09d8936eacf9d674d28bb8a460"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.471379 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.471424 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" event={"ID":"37a5e44f-9a88-4405-be8a-b645485e7312","Type":"ContainerStarted","Data":"31dea8a0dd39a92dadbf2986428fb8e0fa75109d5e858011aacfb80922568f51"} Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.486710 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.503059 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.517035 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.531783 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.549977 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-operator]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":false,\\\"restartCount\\\":5,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.582816 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.606966 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.630625 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: 
connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.642858 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"qu
ay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.652766 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.663169 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.673237 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.687238 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:52 crc kubenswrapper[4812]: I0131 04:26:52.706703 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": dial tcp 127.0.0.1:9743: connect: connection refused" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.284268 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 09:09:49.091515772 +0000 UTC Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.315141 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.315298 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.315423 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 
nodeName:}" failed. No retries permitted until 2026-01-31 04:26:55.315395036 +0000 UTC m=+23.810416731 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.338646 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.338676 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.338676 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.338822 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.338977 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.339166 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.415744 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.415890 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.415952 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.415990 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416039 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:26:55.415994293 +0000 UTC m=+23.911015988 (durationBeforeRetry 2s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416080 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416111 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416124 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416156 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416168 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416223 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416182 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:55.416164678 +0000 UTC m=+23.911186343 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416243 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416263 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:55.41624178 +0000 UTC m=+23.911263575 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:53 crc kubenswrapper[4812]: E0131 04:26:53.416309 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:55.416286671 +0000 UTC m=+23.911308426 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.614936 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.621691 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.632506 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/kube-controller-manager-crc"] Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.641067 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.659326 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.678076 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.692861 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.706359 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.721650 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.737133 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.754193 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.769077 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.781351 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.800735 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.812696 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.825752 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.840062 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:53 crc kubenswrapper[4812]: I0131 04:26:53.854679 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:53Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.285160 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 13:28:47.861947246 +0000 UTC Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.832148 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-etcd/etcd-crc" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.847936 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.850457 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd/etcd-crc"] Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.850572 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-etcd/etcd-crc" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.864008 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"}
,{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.881981 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.897655 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.913140 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.939801 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.960653 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:54 crc kubenswrapper[4812]: I0131 04:26:54.975488 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:54Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.007822 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.024929 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.044314 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.060059 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with 
unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.079711 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.101422 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.125414 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.145714 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.161638 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.285551 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-05 10:10:32.260116535 +0000 UTC Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.331981 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.332106 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.332169 4812 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:59.332151624 +0000 UTC m=+27.827173299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.339027 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.339096 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.339172 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.339229 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.339480 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.339699 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.368035 4812 csr.go:261] certificate signing request csr-jmkwl is approved, waiting to be issued Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.383240 4812 csr.go:257] certificate signing request csr-jmkwl is issued Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.407592 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-h7gqd"] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.408124 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.408226 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/node-ca-kctmd"] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.408666 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.410198 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.410255 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: W0131 04:26:55.410255 4812 reflector.go:561] object-"openshift-image-registry"/"node-ca-dockercfg-4777p": failed to list *v1.Secret: secrets "node-ca-dockercfg-4777p" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-image-registry": no relationship found between node 'crc' and this object Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.410349 4812 reflector.go:158] "Unhandled Error" err="object-\"openshift-image-registry\"/\"node-ca-dockercfg-4777p\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-ca-dockercfg-4777p\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-image-registry\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.410541 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 04:26:55 crc kubenswrapper[4812]: W0131 04:26:55.410779 4812 reflector.go:561] object-"openshift-dns"/"node-resolver-dockercfg-kz9s7": failed to list *v1.Secret: secrets "node-resolver-dockercfg-kz9s7" is forbidden: User "system:node:crc" cannot list resource "secrets" in API group "" in the namespace "openshift-dns": no relationship found between node 'crc' and this object Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.410810 4812 reflector.go:158] "Unhandled Error" 
err="object-\"openshift-dns\"/\"node-resolver-dockercfg-kz9s7\": Failed to watch *v1.Secret: failed to list *v1.Secret: secrets \"node-resolver-dockercfg-kz9s7\" is forbidden: User \"system:node:crc\" cannot list resource \"secrets\" in API group \"\" in the namespace \"openshift-dns\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.411076 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.411260 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433094 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433177 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2b2af11-2df5-49c5-92e2-3965de954bb2-host\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433204 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3488c03b-583a-49f2-818a-0b2d55648e51-hosts-file\") pod \"node-resolver-h7gqd\" (UID: \"3488c03b-583a-49f2-818a-0b2d55648e51\") " pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433238 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433270 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433296 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433319 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2b2af11-2df5-49c5-92e2-3965de954bb2-serviceca\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433341 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgj4m\" (UniqueName: \"kubernetes.io/projected/3488c03b-583a-49f2-818a-0b2d55648e51-kube-api-access-fgj4m\") pod \"node-resolver-h7gqd\" (UID: \"3488c03b-583a-49f2-818a-0b2d55648e51\") " pod="openshift-dns/node-resolver-h7gqd" Jan 31 
04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.433360 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fx7l4\" (UniqueName: \"kubernetes.io/projected/e2b2af11-2df5-49c5-92e2-3965de954bb2-kube-api-access-fx7l4\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.433478 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:26:59.43346027 +0000 UTC m=+27.928481935 (durationBeforeRetry 4s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.433601 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.433615 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.433627 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.433676 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:59.433667765 +0000 UTC m=+27.928689440 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434052 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434059 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434118 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434128 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 
04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434106 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:59.434095577 +0000 UTC m=+27.929117332 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:55 crc kubenswrapper[4812]: E0131 04:26:55.434168 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:26:59.434157409 +0000 UTC m=+27.929179164 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.441250 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"r
unning\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPa
th\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f
63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"las
tState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.463894 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.475230 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.479096 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" event={"ID":"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49","Type":"ContainerStarted","Data":"d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a"} Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.489187 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [iptables-alerter]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.507435 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.526759 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534434 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2b2af11-2df5-49c5-92e2-3965de954bb2-serviceca\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534460 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fx7l4\" (UniqueName: \"kubernetes.io/projected/e2b2af11-2df5-49c5-92e2-3965de954bb2-kube-api-access-fx7l4\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534479 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fgj4m\" (UniqueName: \"kubernetes.io/projected/3488c03b-583a-49f2-818a-0b2d55648e51-kube-api-access-fgj4m\") pod \"node-resolver-h7gqd\" (UID: 
\"3488c03b-583a-49f2-818a-0b2d55648e51\") " pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534495 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2b2af11-2df5-49c5-92e2-3965de954bb2-host\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534514 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3488c03b-583a-49f2-818a-0b2d55648e51-hosts-file\") pod \"node-resolver-h7gqd\" (UID: \"3488c03b-583a-49f2-818a-0b2d55648e51\") " pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534571 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/3488c03b-583a-49f2-818a-0b2d55648e51-hosts-file\") pod \"node-resolver-h7gqd\" (UID: \"3488c03b-583a-49f2-818a-0b2d55648e51\") " pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.534938 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e2b2af11-2df5-49c5-92e2-3965de954bb2-host\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.535367 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/e2b2af11-2df5-49c5-92e2-3965de954bb2-serviceca\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.540455 4812 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.557178 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastSta
te\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},
{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.557390 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgj4m\" (UniqueName: \"kubernetes.io/projected/3488c03b-583a-49f2-818a-0b2d55648e51-kube-api-access-fgj4m\") pod \"node-resolver-h7gqd\" (UID: \"3488c03b-583a-49f2-818a-0b2d55648e51\") " 
pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.557458 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fx7l4\" (UniqueName: \"kubernetes.io/projected/e2b2af11-2df5-49c5-92e2-3965de954bb2-kube-api-access-fx7l4\") pod \"node-ca-kctmd\" (UID: \"e2b2af11-2df5-49c5-92e2-3965de954bb2\") " pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.570405 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.581247 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.595347 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.612503 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"c
ert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.635538 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.661053 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.675413 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.687129 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.697792 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.710361 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.723326 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.732607 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.753905 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.848972 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-pnwcx"] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.849574 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-daemon-lx2wb"] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.849697 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.850362 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-additional-cni-plugins-2vzj6"] Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.850459 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.851384 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.853730 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.853865 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.853945 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854192 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854196 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"cni-copy-resources" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854236 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854265 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854198 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854466 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854494 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.854704 4812 reflector.go:368] Caches populated for 
*v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.855128 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.873224 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.885700 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.898391 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\
\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170f
d1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.909645 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.933687 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936312 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5vp29\" (UniqueName: \"kubernetes.io/projected/258de1b0-7f55-45cb-9ce9-57366ae91c94-kube-api-access-5vp29\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936353 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-netns\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936380 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-multus-daemon-config\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936405 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-96b4n\" (UniqueName: \"kubernetes.io/projected/6050f642-2492-4f83-a739-ac905c409b8c-kube-api-access-96b4n\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936452 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936475 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-cnibin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936496 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-kubelet\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936520 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936562 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-system-cni-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936584 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-os-release\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936638 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-bin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936686 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2l7vl\" (UniqueName: \"kubernetes.io/projected/62392df6-29ca-4dfc-b3ab-db13388a43a6-kube-api-access-2l7vl\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936708 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-k8s-cni-cncf-io\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936727 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-conf-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936757 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-os-release\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936775 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62392df6-29ca-4dfc-b3ab-db13388a43a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936814 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-system-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936858 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/62392df6-29ca-4dfc-b3ab-db13388a43a6-rootfs\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936886 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-cnibin\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936905 
4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-socket-dir-parent\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936925 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-hostroot\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936944 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-multus-certs\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936962 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-cni-binary-copy\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.936983 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-multus\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.937001 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-etc-kubernetes\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.937021 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62392df6-29ca-4dfc-b3ab-db13388a43a6-proxy-tls\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.937063 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.937088 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.948084 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.960655 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.978029 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:55 crc kubenswrapper[4812]: I0131 04:26:55.993462 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:55Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038177 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-multus-certs\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038212 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" 
(UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-cni-binary-copy\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038231 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-multus\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038246 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-etc-kubernetes\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038262 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62392df6-29ca-4dfc-b3ab-db13388a43a6-proxy-tls\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038294 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038309 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: 
\"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038326 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5vp29\" (UniqueName: \"kubernetes.io/projected/258de1b0-7f55-45cb-9ce9-57366ae91c94-kube-api-access-5vp29\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038342 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-netns\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038358 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-multus-daemon-config\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038373 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-96b4n\" (UniqueName: \"kubernetes.io/projected/6050f642-2492-4f83-a739-ac905c409b8c-kube-api-access-96b4n\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038388 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038402 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-cnibin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038416 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-kubelet\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038433 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038461 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-system-cni-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038475 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-os-release\") pod 
\"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038489 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-bin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038503 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2l7vl\" (UniqueName: \"kubernetes.io/projected/62392df6-29ca-4dfc-b3ab-db13388a43a6-kube-api-access-2l7vl\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038519 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-k8s-cni-cncf-io\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038535 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-conf-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038550 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-os-release\") pod \"multus-pnwcx\" (UID: 
\"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038564 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62392df6-29ca-4dfc-b3ab-db13388a43a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038579 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-system-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038592 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/62392df6-29ca-4dfc-b3ab-db13388a43a6-rootfs\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038607 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-cnibin\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038620 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-socket-dir-parent\") pod \"multus-pnwcx\" (UID: 
\"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038634 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-hostroot\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.038689 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-multus-certs\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039201 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039260 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-cni-binary-copy\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039807 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcd-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/62392df6-29ca-4dfc-b3ab-db13388a43a6-mcd-auth-proxy-config\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039337 
4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-bin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039388 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-k8s-cni-cncf-io\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039300 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-system-cni-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039431 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-os-release\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039432 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-os-release\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039444 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: 
\"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-conf-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039442 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-system-cni-dir\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039471 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-cni-multus\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039472 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-multus-socket-dir-parent\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039501 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-hostroot\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039516 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-run-netns\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 
31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039517 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-cnibin\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039525 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-host-var-lib-kubelet\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039547 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/6050f642-2492-4f83-a739-ac905c409b8c-etc-kubernetes\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039740 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-cnibin\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.039397 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rootfs\" (UniqueName: \"kubernetes.io/host-path/62392df6-29ca-4dfc-b3ab-db13388a43a6-rootfs\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.040319 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" 
(UniqueName: \"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.040385 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/258de1b0-7f55-45cb-9ce9-57366ae91c94-cni-binary-copy\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.040715 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/6050f642-2492-4f83-a739-ac905c409b8c-multus-daemon-config\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.044583 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/62392df6-29ca-4dfc-b3ab-db13388a43a6-proxy-tls\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.046508 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/258de1b0-7f55-45cb-9ce9-57366ae91c94-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.061318 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5vp29\" (UniqueName: 
\"kubernetes.io/projected/258de1b0-7f55-45cb-9ce9-57366ae91c94-kube-api-access-5vp29\") pod \"multus-additional-cni-plugins-2vzj6\" (UID: \"258de1b0-7f55-45cb-9ce9-57366ae91c94\") " pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.066398 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2l7vl\" (UniqueName: \"kubernetes.io/projected/62392df6-29ca-4dfc-b3ab-db13388a43a6-kube-api-access-2l7vl\") pod \"machine-config-daemon-lx2wb\" (UID: \"62392df6-29ca-4dfc-b3ab-db13388a43a6\") " pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.072366 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-96b4n\" (UniqueName: \"kubernetes.io/projected/6050f642-2492-4f83-a739-ac905c409b8c-kube-api-access-96b4n\") pod \"multus-pnwcx\" (UID: \"6050f642-2492-4f83-a739-ac905c409b8c\") " pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.102258 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.139060 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.161870 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.165223 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-pnwcx" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.172918 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.177158 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"
mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.177362 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" Jan 31 04:26:56 crc kubenswrapper[4812]: W0131 04:26:56.193776 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod258de1b0_7f55_45cb_9ce9_57366ae91c94.slice/crio-dd4d445481e39171cb3fa72986fd2d9ff7558a22124cacff6880f848d13c719b WatchSource:0}: Error finding container dd4d445481e39171cb3fa72986fd2d9ff7558a22124cacff6880f848d13c719b: Status 404 returned error can't find the container with id dd4d445481e39171cb3fa72986fd2d9ff7558a22124cacff6880f848d13c719b Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.193867 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.220568 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.240290 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.250203 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.268628 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\"
:false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\
\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/s
erviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.280742 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID
\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"202
6-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.286558 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 08:11:53.323549803 +0000 UTC Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.298101 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.309208 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.323682 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.340208 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d6
08d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.357098 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af
0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.371454 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.383373 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.384389 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Certificate expiration is 2027-01-31 04:21:55 +0000 UTC, rotation deadline is 2026-11-19 21:54:07.253393382 +0000 UTC Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.384469 4812 certificate_manager.go:356] kubernetes.io/kube-apiserver-client-kubelet: Waiting 7025h27m10.868929345s for next certificate rotation Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.486366 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerStarted","Data":"ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.486413 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerStarted","Data":"dd4d445481e39171cb3fa72986fd2d9ff7558a22124cacff6880f848d13c719b"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.488336 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.488362 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de"} Jan 31 
04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.488373 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"7555c99c7112fed9f2e18030dee2fbdc7a5e145c9c198bf5563ce8cec014dfde"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.490121 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerStarted","Data":"d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.490191 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerStarted","Data":"31259604aeb3f4841fa041c091b97f35c61cb2bd060d5e500f444d81d7cf2949"} Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.508670 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.527137 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.539553 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.561239 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.572060 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.584390 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.598137 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.600983 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2f9"] Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.602012 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.603756 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.604712 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.605564 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.606305 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.606537 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.606634 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.607557 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.611771 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/
kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.626272 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\
\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\
"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d
742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.638560 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643496 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643535 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643554 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643571 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643585 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643607 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643634 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643736 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643797 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units\") pod \"ovnkube-node-bl2f9\" (UID: 
\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643816 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643893 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.643937 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvm2v\" (UniqueName: \"kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644037 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644103 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644144 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644165 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644187 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644210 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644225 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.644246 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.648469 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.659260 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [machine-config-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.671524 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/ser
viceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.683240 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52
Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.697250 4812 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call 
webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.708470 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.720942 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.721929 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-dns/node-resolver-h7gqd" secret="" err="failed to sync secret cache: timed out waiting for the condition" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.721988 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-h7gqd" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.729077 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="openshift-image-registry/node-ca-kctmd" secret="" err="failed to sync secret cache: timed out waiting for the condition" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.729127 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-kctmd" Jan 31 04:26:56 crc kubenswrapper[4812]: W0131 04:26:56.735289 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3488c03b_583a_49f2_818a_0b2d55648e51.slice/crio-8ce60bc7c574c8a5f78ce341d16cefb3085e6593f9d49f3f38450444def8cfea WatchSource:0}: Error finding container 8ce60bc7c574c8a5f78ce341d16cefb3085e6593f9d49f3f38450444def8cfea: Status 404 returned error can't find the container with id 8ce60bc7c574c8a5f78ce341d16cefb3085e6593f9d49f3f38450444def8cfea Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745088 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745158 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745226 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745260 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745292 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745320 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745354 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745383 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745412 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745467 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745504 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745531 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745560 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd\") pod 
\"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745589 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cvm2v\" (UniqueName: \"kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745603 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745622 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745652 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745672 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: 
\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745713 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745745 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745780 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745807 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.745961 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746321 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746423 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746480 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746714 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746739 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746774 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746815 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746822 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746877 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746813 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746903 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746864 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746823 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.746891 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: W0131 04:26:56.748668 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode2b2af11_2df5_49c5_92e2_3965de954bb2.slice/crio-da85e7ac36a3577cee592154c3f13a93a8fd83ac55025a36b499d9476df05cd0 WatchSource:0}: Error finding container da85e7ac36a3577cee592154c3f13a93a8fd83ac55025a36b499d9476df05cd0: Status 404 returned error can't find the container with id da85e7ac36a3577cee592154c3f13a93a8fd83ac55025a36b499d9476df05cd0 Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.749384 
4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.751338 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.763136 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\
":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.789053 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvm2v\" (UniqueName: \"kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v\") pod \"ovnkube-node-bl2f9\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.825264 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":
\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-
cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.864040 4812 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.875265 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.916490 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:26:56 crc kubenswrapper[4812]: W0131 04:26:56.927123 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd92c2935_9600_4e3b_b6ef_2be7b6d9ef7a.slice/crio-33845ad6b8799d85829540bc9956f770d623a8e21cddb107e3c3705d4ffc7930 WatchSource:0}: Error finding container 33845ad6b8799d85829540bc9956f770d623a8e21cddb107e3c3705d4ffc7930: Status 404 returned error can't find the container with id 33845ad6b8799d85829540bc9956f770d623a8e21cddb107e3c3705d4ffc7930 Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.938498 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"start
ed\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\
"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a564
6fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.960174 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:26:56Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:56 crc kubenswrapper[4812]: I0131 04:26:56.994484 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.021732 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.067864 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with incomplete status: [kubecfg-setup]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.102688 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubern
etes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cl
uster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\
\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.144483 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.182500 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.226405 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.265756 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.286918 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-17 23:28:58.052448215 +0000 UTC Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.339383 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:57 crc kubenswrapper[4812]: E0131 04:26:57.339831 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.339457 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:57 crc kubenswrapper[4812]: E0131 04:26:57.339955 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.339430 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:57 crc kubenswrapper[4812]: E0131 04:26:57.340372 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.500612 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f" exitCode=0 Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.500732 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.500796 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"33845ad6b8799d85829540bc9956f770d623a8e21cddb107e3c3705d4ffc7930"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.503994 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kctmd" event={"ID":"e2b2af11-2df5-49c5-92e2-3965de954bb2","Type":"ContainerStarted","Data":"d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.504031 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-kctmd" event={"ID":"e2b2af11-2df5-49c5-92e2-3965de954bb2","Type":"ContainerStarted","Data":"da85e7ac36a3577cee592154c3f13a93a8fd83ac55025a36b499d9476df05cd0"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.505854 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7gqd" event={"ID":"3488c03b-583a-49f2-818a-0b2d55648e51","Type":"ContainerStarted","Data":"f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 
04:26:57.505914 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-h7gqd" event={"ID":"3488c03b-583a-49f2-818a-0b2d55648e51","Type":"ContainerStarted","Data":"8ce60bc7c574c8a5f78ce341d16cefb3085e6593f9d49f3f38450444def8cfea"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.511618 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82" exitCode=0 Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.511673 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82"} Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.528006 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.591862 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.607732 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.617323 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [node-ca]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.630085 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.643124 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.657457 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [egress-router-binary-copy cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"ima
ge\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69
b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volum
eMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.671884 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.686387 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.697223 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.711252 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.742526 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.784471 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[dns-node-resolver]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.828019 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.865547 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.908466 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.949363 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:57 crc kubenswrapper[4812]: I0131 04:26:57.991866 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:57Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.022378 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.060892 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.087041 4812 kubelet_node_status.go:401] "Setting node annotation to enable volume controller attach/detach" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.088916 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.088950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.088963 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.089084 4812 kubelet_node_status.go:76] "Attempting to register node" node="crc" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.110293 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.154757 4812 kubelet_node_status.go:115] "Node was previously registered" node="crc" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.155027 4812 kubelet_node_status.go:79] "Successfully registered node" node="crc" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.156176 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.156211 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.156220 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.156234 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.156243 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false 
reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.169862 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.173306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.173336 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.173344 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.173359 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.173371 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.186982 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.187774 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921
a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.190974 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.191009 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.191019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.191032 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.191042 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.202017 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.205008 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.205032 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.205042 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.205056 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.205066 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.219093 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.221509 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.222437 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.222483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.222493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.222508 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.222517 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.234283 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: E0131 04:26:58.234445 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.235759 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.235793 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.235806 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.235825 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.235853 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.264095 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.287284 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-17 23:19:47.288127569 +0000 UTC Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.306953 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.338440 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc 
kubenswrapper[4812]: I0131 04:26:58.338487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.338498 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.338517 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.338530 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.352224 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [cni-plugins bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy 
whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"container
ID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"rea
dy\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host
/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.387395 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.431672 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.442297 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.442367 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.442389 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.442415 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.442441 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.461472 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",
\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.503942 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.517711 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b" exitCode=0 Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.517807 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" 
event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.524951 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.525009 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.525029 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.525048 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.525064 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.525081 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" 
event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.542668 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-c
erts\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageI
D\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedA
t\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.545673 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.545724 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.545737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.545756 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.545767 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.586738 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.627999 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.648879 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.648945 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.648970 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.649002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.649025 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.667661 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b48229
0ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.708808 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [bond-cni-plugin routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha2
56:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"re
ason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.744300 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify 
certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.751552 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.751586 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.751595 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.751613 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.751625 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.782107 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.823813 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.855186 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.855256 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.855275 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.855304 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.855322 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.866204 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.902709 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.944485 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.957696 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.957747 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.957766 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:58 crc 
kubenswrapper[4812]: I0131 04:26:58.957791 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:58 crc kubenswrapper[4812]: I0131 04:26:58.957809 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:58Z","lastTransitionTime":"2026-01-31T04:26:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.002010 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:58Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.028521 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.061126 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.061165 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.061175 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.061188 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.061199 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.066519 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes
.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.115992 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.164479 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.164521 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.164529 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.164544 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.164555 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.268483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.268956 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.268976 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.269006 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.269024 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.288109 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-08 05:51:30.569643975 +0000 UTC Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.338910 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.339044 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.339050 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.338911 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.339300 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.339532 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.371747 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.371806 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.371825 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.371879 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.371899 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.376570 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.376789 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.376931 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:07.376902709 +0000 UTC m=+35.871924414 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.473929 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.473973 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.473984 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.474002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.474014 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.477556 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.477668 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477725 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:27:07.477702131 +0000 UTC m=+35.972723806 (durationBeforeRetry 8s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.477768 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.477810 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477854 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477874 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477887 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod 
openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477923 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477938 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:07.477921987 +0000 UTC m=+35.972943662 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477960 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477972 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:07.477958518 +0000 UTC m=+35.972980193 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477980 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.477992 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:59 crc kubenswrapper[4812]: E0131 04:26:59.478034 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:07.47802571 +0000 UTC m=+35.973047395 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.530664 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28" exitCode=0 Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.530726 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.551738 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.565973 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.576355 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.576391 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.576402 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.576421 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.576433 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.586180 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.604713 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.620289 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [routeoverride-cni whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\
\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 
04:26:59.632989 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.643779 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/ku
bernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.654304 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.666289 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.677515 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.680195 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.680267 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.680289 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.680316 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 
04:26:59.680337 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.690730 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.712354 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.727982 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.740665 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.766393 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:26:59Z is after 2025-08-24T17:21:41Z" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.782398 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.782457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.782480 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.782511 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.782535 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.885646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.885696 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.885713 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.885739 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.885755 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.989060 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.989360 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.989483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.989601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:26:59 crc kubenswrapper[4812]: I0131 04:26:59.989717 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:26:59Z","lastTransitionTime":"2026-01-31T04:26:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.093123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.093191 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.093214 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.093247 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.093269 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.196477 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.196558 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.196579 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.196611 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.196633 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.288491 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 07:44:07.08006676 +0000 UTC Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.300092 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.300163 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.300186 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.300216 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.300239 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.403995 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.404044 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.404059 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.404079 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.404094 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.505789 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.506185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.506350 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.506510 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.506879 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.536245 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af" exitCode=0 Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.536308 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.544167 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.558409 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.579258 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.594269 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.609780 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.609862 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.609880 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.609904 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.609921 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.610288 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.633754 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.657173 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.675461 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.689251 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.701409 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.713012 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.713039 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.713047 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.713059 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.713067 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.727489 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.746145 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni-bincopy whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"moun
tPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b2826217
32bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-0
1-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servi
ceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.761926 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\"
:\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01
-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-de
v@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.781036 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.800943 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.815574 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.815622 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.815634 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.815652 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.815666 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.816045 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b48229
0ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernet
es.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:00Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.918821 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.918912 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.918930 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.918955 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:00 crc kubenswrapper[4812]: I0131 04:27:00.918975 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:00Z","lastTransitionTime":"2026-01-31T04:27:00Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.022345 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.022400 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.022418 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.022442 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.022461 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.125674 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.125736 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.125756 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.125786 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.125802 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.228068 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.228102 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.228114 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.228130 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.228143 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.289318 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 19:10:28.501859704 +0000 UTC Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.330635 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.330676 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.330689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.330706 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.330720 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.338881 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.338913 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.339021 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:01 crc kubenswrapper[4812]: E0131 04:27:01.339125 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:01 crc kubenswrapper[4812]: E0131 04:27:01.339744 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:01 crc kubenswrapper[4812]: E0131 04:27:01.339882 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.434927 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.434968 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.434979 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.434993 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.435006 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.538219 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.538256 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.538269 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.538285 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.538298 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.551142 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35" exitCode=0 Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.551199 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.586919 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sh
a256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"
started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-d
ir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd
6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.603732 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.616871 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.642026 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.642088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.642106 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.642134 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.642152 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.645137 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.668736 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.687941 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.706543 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.725611 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.744334 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc 
kubenswrapper[4812]: I0131 04:27:01.744422 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.744435 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.744478 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.744491 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.749298 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: 
[whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containe
rID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-
allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":f
alse,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\
\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.777447 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.792144 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.806321 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.822023 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.841896 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.847308 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.847341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.847349 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.847364 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 
04:27:01.847374 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.859066 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:01Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.949991 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.950081 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.950099 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.950126 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:01 crc kubenswrapper[4812]: I0131 04:27:01.950143 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:01Z","lastTransitionTime":"2026-01-31T04:27:01Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.053157 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.053234 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.053258 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.053289 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.053314 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.112263 4812 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.156323 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.156394 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.156419 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.156450 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.156472 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.260033 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.260096 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.260112 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.260136 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.260152 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.289754 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-11 19:26:03.409638129 +0000 UTC Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.360415 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\
\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/op
enshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.362799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.362900 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.362929 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.362956 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.362974 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI 
configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.384270 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.403065 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.423135 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.451227 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with incomplete status: [whereabouts-cni]\\\",\\\"reason\\\":\\\"ContainersNotInitialized\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"c
nibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.466809 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.467094 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.467229 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.467360 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.467495 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.471482 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.494104 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.510835 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.526051 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.542443 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.559518 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.561023 4812 generic.go:334] "Generic (PLEG): container finished" podID="258de1b0-7f55-45cb-9ce9-57366ae91c94" containerID="f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152" exitCode=0 Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.561086 
4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerDied","Data":"f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.570451 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.570515 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.570533 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.570556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.570573 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.590003 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\
\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":
true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2
026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.605053 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.615446 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.637445 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},
{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\
"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mount
Path\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\"
:{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.649643 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.660634 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.672960 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.673052 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.673429 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.673438 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.673451 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.673459 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.684915 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.701575 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.714001 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.734551 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.746751 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.758667 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.776230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.776307 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.776325 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.776411 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.776432 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.786572 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.810977 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"message\\\":\\\"containers with unready status: 
[kube-multus-additional-cni-plugins]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Di
sabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367
c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\
\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.826392 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.839943 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.853087 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.867691 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:02Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.879333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc 
kubenswrapper[4812]: I0131 04:27:02.879368 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.879380 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.879397 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.879410 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.982549 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.982609 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.982625 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.982652 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:02 crc kubenswrapper[4812]: I0131 04:27:02.982670 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:02Z","lastTransitionTime":"2026-01-31T04:27:02Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.085680 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.085729 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.085748 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.085773 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.085793 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.188752 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.188788 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.188803 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.188823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.188871 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.289893 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 01:48:26.541245209 +0000 UTC Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.291718 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.291757 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.291773 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.291795 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.291810 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.338932 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:03 crc kubenswrapper[4812]: E0131 04:27:03.339129 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.339666 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:03 crc kubenswrapper[4812]: E0131 04:27:03.339786 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.339885 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:03 crc kubenswrapper[4812]: E0131 04:27:03.339969 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.394312 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.394346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.394354 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.394367 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.394376 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.497278 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.497310 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.497318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.497330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.497340 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.567591 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" event={"ID":"258de1b0-7f55-45cb-9ce9-57366ae91c94","Type":"ContainerStarted","Data":"df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.572695 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.572985 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.573006 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.594670 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.600387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.600450 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.600473 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.600503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.600521 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.611983 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.622376 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.622831 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.629612 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0
b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.659161 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovn-controller ovn-acl-logging kube-rbac-proxy-node kube-rbac-proxy-ovn-metrics northd nbdb sbdb 
ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\
\":\\\"nbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"wa
iting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"read
Only\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"PodInitializing\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mou
ntPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.682321 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335
e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"
running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.701966 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.703156 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.703238 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.703262 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.703291 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.703313 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.721224 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.740435 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.764092 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.785762 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.806282 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.806346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.806363 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.806387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.806404 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.807278 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.828052 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.844351 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.865615 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.885074 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.901265 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.910406 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.910467 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.910475 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.910493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.910505 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:03Z","lastTransitionTime":"2026-01-31T04:27:03Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.920767 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.945814 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.970426 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:03 crc kubenswrapper[4812]: I0131 04:27:03.991557 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:03Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.005540 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.013871 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.013921 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.013939 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.013964 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.013983 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.022074 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.041423 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.060446 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.078654 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.095152 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.110509 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.117288 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.117344 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.117365 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.117391 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.117410 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.168057 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnl
y\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b
17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Run
ning\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.193681 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\
\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.12
6.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.209755 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:04Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.219991 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.220024 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.220032 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.220046 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.220055 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.290865 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 21:52:28.561470502 +0000 UTC Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.323518 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.323590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.323612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.323644 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.323665 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.427977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.428026 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.428043 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.428070 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.428088 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.531140 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.531202 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.531227 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.531256 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.531278 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.577443 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.633666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.633730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.633751 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.633775 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.633794 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.735877 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.735925 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.735940 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.735959 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.735971 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.837915 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.838007 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.838027 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.838053 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.838073 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.940108 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.940174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.940193 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.940218 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:04 crc kubenswrapper[4812]: I0131 04:27:04.940237 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:04Z","lastTransitionTime":"2026-01-31T04:27:04Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.042601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.042654 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.042669 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.042685 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.042698 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.145593 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.145661 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.145678 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.145698 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.145716 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.249060 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.249116 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.249136 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.249155 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.249169 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.291469 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-04 15:03:08.771819499 +0000 UTC Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.339076 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.339124 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.339072 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:05 crc kubenswrapper[4812]: E0131 04:27:05.339275 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:05 crc kubenswrapper[4812]: E0131 04:27:05.339403 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:05 crc kubenswrapper[4812]: E0131 04:27:05.339565 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.351950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.352003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.352019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.352041 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.352058 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.454596 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.454651 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.454668 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.454693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.454710 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.557766 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.557815 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.557826 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.557860 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.557874 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.579880 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.660785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.660832 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.660861 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.660878 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.660889 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.763769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.763808 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.763819 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.763850 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.763862 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.866607 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.866639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.866649 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.866665 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.866677 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.970092 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.970141 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.970152 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.970169 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:05 crc kubenswrapper[4812]: I0131 04:27:05.970180 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:05Z","lastTransitionTime":"2026-01-31T04:27:05Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.073602 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.073644 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.073656 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.073674 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.073686 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.176200 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.176254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.176270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.176294 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.176311 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.279498 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.279578 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.279597 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.279759 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.279793 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.291927 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 23:54:00.385161723 +0000 UTC Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.381787 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.381872 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.381901 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.381925 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.381942 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.484814 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.484907 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.484925 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.484952 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.484971 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.586799 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/0.log" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.587708 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.587764 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.587781 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.587804 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.587822 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.591685 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8" exitCode=1 Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.591721 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.592800 4812 scope.go:117] "RemoveContainer" containerID="4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.612221 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.629308 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.646172 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.664765 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.686678 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.690440 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.690487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.690503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.690525 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 
04:27:06.690543 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.708259 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.733460 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.752483 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.771278 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.793294 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.793327 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.793337 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.793353 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.793365 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.799323 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 
6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"k
ube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\
\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.817665 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\
\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\
"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad
7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Co
mpleted\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed 
calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.832135 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\"
:\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/
openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\
\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.846629 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.861546 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.873746 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:06Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.896362 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc 
kubenswrapper[4812]: I0131 04:27:06.896416 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.896428 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.896445 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.896458 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.998127 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.998186 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.998205 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.998230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:06 crc kubenswrapper[4812]: I0131 04:27:06.998249 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:06Z","lastTransitionTime":"2026-01-31T04:27:06Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.100475 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.100538 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.100559 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.100588 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.100606 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.203117 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.203174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.203192 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.203212 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.203231 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.292937 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 01:53:18.502970155 +0000 UTC Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.306462 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.306535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.306559 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.306589 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.306610 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.338912 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.339045 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.339137 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.338935 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.339210 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.339413 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.394340 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.394572 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.394701 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:23.394671938 +0000 UTC m=+51.889693633 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.409295 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.409344 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.409364 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.409388 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.409405 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.495006 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.495155 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495241 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:27:23.495204963 +0000 UTC m=+51.990226668 (durationBeforeRetry 16s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495311 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495340 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495359 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.495401 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495428 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr 
podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:23.495407019 +0000 UTC m=+51.990428714 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.495460 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495545 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495585 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495601 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:23.495587243 +0000 UTC m=+51.990608938 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495606 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495623 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:07 crc kubenswrapper[4812]: E0131 04:27:07.495667 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:23.495651735 +0000 UTC m=+51.990673440 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.512132 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.512167 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.512179 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.512193 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.512202 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.597061 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/0.log" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.601136 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.601297 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.614427 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.614490 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.614508 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.614532 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.614549 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.623189 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.635266 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.648432 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.666358 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.683216 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes
/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\
\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.697199 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with 
unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.717318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.717567 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.717694 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.717912 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.718094 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.720830 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.739111 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.757444 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.780297 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 
reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace 
ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\
":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\
\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.798551 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.818452 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.820730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.820761 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.820770 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.820785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.820795 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.836755 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.851142 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.867237 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:07Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.924164 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.924203 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.924213 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.924230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:07 crc kubenswrapper[4812]: I0131 04:27:07.924239 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:07Z","lastTransitionTime":"2026-01-31T04:27:07Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.028388 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.028757 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.028866 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.028964 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.029062 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.131154 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.131217 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.131235 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.131268 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.131291 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.233544 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.233613 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.233631 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.233654 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.233673 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.285881 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.285946 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.285962 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.285987 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.286005 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.293866 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 11:01:42.001809737 +0000 UTC Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.302586 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177
c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/op
enshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c3
7e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeByt
es\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",
\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.306952 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.306994 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.307006 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.307026 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.307041 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.321673 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.326201 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.326254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.326272 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.326295 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.326309 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.340912 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.345320 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.345383 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.345397 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.345421 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.345436 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.361345 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.365645 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.365701 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.365723 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.365754 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.365773 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.381791 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.382043 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.390175 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.390219 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.390231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.390249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.390261 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.493200 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.493255 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.493270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.493295 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.493311 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.595961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.596027 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.596045 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.596069 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.596085 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.606509 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/1.log" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.607316 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/0.log" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.610921 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e" exitCode=1 Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.610973 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.611017 4812 scope.go:117] "RemoveContainer" containerID="4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.612409 4812 scope.go:117] "RemoveContainer" containerID="154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e" Jan 31 04:27:08 crc kubenswrapper[4812]: E0131 04:27:08.612800 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 10s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.633931 4812 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.650100 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.670038 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.688661 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.699296 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.699349 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.699366 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.699389 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.699407 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.707203 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.725267 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac
39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.737982 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.753743 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.766390 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"c
ri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.780735 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.799081 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.801616 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.801650 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.801661 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.801677 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.801689 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.814349 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":
\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.838496 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 
6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.873096 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.890398 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:08Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.904452 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.904511 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.904530 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.904556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:08 crc kubenswrapper[4812]: I0131 04:27:08.904573 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:08Z","lastTransitionTime":"2026-01-31T04:27:08Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.008038 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.008284 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.008460 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.008641 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.008826 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.111831 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.112080 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.112215 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.112353 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.112479 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.215581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.215646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.215666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.215691 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.215708 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.294263 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 16:41:15.675405647 +0000 UTC Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.319251 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.319316 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.319335 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.319358 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.319375 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.338559 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:09 crc kubenswrapper[4812]: E0131 04:27:09.338712 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.339103 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.339162 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:09 crc kubenswrapper[4812]: E0131 04:27:09.339306 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:09 crc kubenswrapper[4812]: E0131 04:27:09.339427 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.421890 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.421926 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.421935 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.421950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.421964 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.508835 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25"] Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.509504 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.512437 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.512524 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.524302 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.524362 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.524379 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.524403 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.524420 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.532755 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.549475 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"res
tartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.579971 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 
reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.614781 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-lc2jk\" (UniqueName: \"kubernetes.io/projected/9c158521-712e-4c94-8acf-5244e32666a5-kube-api-access-lc2jk\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.614920 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.615058 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.615162 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c158521-712e-4c94-8acf-5244e32666a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.617530 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.618570 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/1.log" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.627203 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.627322 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.627342 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.627367 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.627386 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.641312 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\
\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276
e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\
\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.660820 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.680185 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.699605 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.716230 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.716311 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.716381 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c158521-712e-4c94-8acf-5244e32666a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.716423 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lc2jk\" (UniqueName: \"kubernetes.io/projected/9c158521-712e-4c94-8acf-5244e32666a5-kube-api-access-lc2jk\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.717375 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-env-overrides\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc 
kubenswrapper[4812]: I0131 04:27:09.717598 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/9c158521-712e-4c94-8acf-5244e32666a5-ovnkube-config\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.723128 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kub
e-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb0
85a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":
\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/open
shift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPat
h\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.727272 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-control-plane-metrics-cert\" (UniqueName: \"kubernetes.io/secret/9c158521-712e-4c94-8acf-5244e32666a5-ovn-control-plane-metrics-cert\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.730682 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.730729 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.730745 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.730769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.730787 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.746144 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z
\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.747200 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lc2jk\" (UniqueName: 
\"kubernetes.io/projected/9c158521-712e-4c94-8acf-5244e32666a5-kube-api-access-lc2jk\") pod \"ovnkube-control-plane-749d76644c-r9j25\" (UID: \"9c158521-712e-4c94-8acf-5244e32666a5\") " pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.765831 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"
}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.784390 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.803662 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-rbac-proxy 
ovnkube-cluster-manager]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling 
webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.824228 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.826411 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\
\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.834804 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.834913 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.834941 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.834971 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.835012 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:09 crc kubenswrapper[4812]: W0131 04:27:09.848424 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9c158521_712e_4c94_8acf_5244e32666a5.slice/crio-0f4fc7014dafb101ccdd41df36b738d995c3697be87d816cb59071ce2d73a79d WatchSource:0}: Error finding container 0f4fc7014dafb101ccdd41df36b738d995c3697be87d816cb59071ce2d73a79d: Status 404 returned error can't find the container with id 0f4fc7014dafb101ccdd41df36b738d995c3697be87d816cb59071ce2d73a79d Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.850222 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":
\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.870318 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:09Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.937661 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.937705 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.937714 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.937728 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:09 crc kubenswrapper[4812]: I0131 04:27:09.937737 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:09Z","lastTransitionTime":"2026-01-31T04:27:09Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.040190 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.040251 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.040269 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.040292 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.040309 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.142489 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.142544 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.142562 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.142587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.142606 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.245819 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.245905 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.245924 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.245949 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.245966 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.294507 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-19 03:49:16.084186448 +0000 UTC Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.348666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.348720 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.348737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.348761 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.348779 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.451956 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.451992 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.452002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.452018 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.452029 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.555549 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.555996 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.556186 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.556368 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.556551 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.629584 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" event={"ID":"9c158521-712e-4c94-8acf-5244e32666a5","Type":"ContainerStarted","Data":"40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.629668 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" event={"ID":"9c158521-712e-4c94-8acf-5244e32666a5","Type":"ContainerStarted","Data":"9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.629688 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" event={"ID":"9c158521-712e-4c94-8acf-5244e32666a5","Type":"ContainerStarted","Data":"0f4fc7014dafb101ccdd41df36b738d995c3697be87d816cb59071ce2d73a79d"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.642885 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-wg68w"] Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.643530 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: E0131 04:27:10.643627 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.649731 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\"
,\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.659762 4812 kubelet_node_status.go:724] "Recording event message for node" 
node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.659811 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.659829 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.659904 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.659925 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.671040 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.704569 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.727344 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: 
\"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.727406 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zj6tv\" (UniqueName: \"kubernetes.io/projected/2c369253-313a-484c-bc8a-dae99abab086-kube-api-access-zj6tv\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.745093 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:
35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.759523 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.762016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.762137 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.762254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc 
kubenswrapper[4812]: I0131 04:27:10.762338 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.762423 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.775991 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\
\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.786626 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.802951 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.815394 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.828219 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.828484 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zj6tv\" (UniqueName: \"kubernetes.io/projected/2c369253-313a-484c-bc8a-dae99abab086-kube-api-access-zj6tv\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.828531 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: E0131 04:27:10.828681 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:10 crc kubenswrapper[4812]: E0131 04:27:10.829004 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:11.328982819 +0000 UTC m=+39.824004484 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.848438 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zj6tv\" (UniqueName: \"kubernetes.io/projected/2c369253-313a-484c-bc8a-dae99abab086-kube-api-access-zj6tv\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.853759 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 
reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.864482 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 
04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.864513 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.864522 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.864535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.864545 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.867047 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.888511 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.904096 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.920204 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.968740 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.968792 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.968804 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 
04:27:10.968823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.968835 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:10Z","lastTransitionTime":"2026-01-31T04:27:10Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.972615 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:10 crc kubenswrapper[4812]: I0131 04:27:10.987937 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:10Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.004344 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.022176 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.038449 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.060007 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.071319 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.071493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.071556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.071630 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.071697 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.075430 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.088333 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.108760 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.123078 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.137107 4812 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc 
kubenswrapper[4812]: I0131 04:27:11.156930 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.169989 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
1-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854
fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.174260 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.174471 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.174639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc 
kubenswrapper[4812]: I0131 04:27:11.174800 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.174966 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.186610 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.202436 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.216136 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.247967 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 
6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector 
*v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.277338 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.277426 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.277443 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.277468 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.277487 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.282745 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59d
a973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:11Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.295697 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 17:56:38.123400985 +0000 UTC Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.339076 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.339114 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.339143 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:11 crc kubenswrapper[4812]: E0131 04:27:11.339676 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:11 crc kubenswrapper[4812]: E0131 04:27:11.339735 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:11 crc kubenswrapper[4812]: E0131 04:27:11.339447 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.372668 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:11 crc kubenswrapper[4812]: E0131 04:27:11.373005 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:11 crc kubenswrapper[4812]: E0131 04:27:11.373136 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:12.373105078 +0000 UTC m=+40.868126773 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.380397 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.380584 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.380718 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.380879 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.381028 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.484523 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.484575 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.484590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.484608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.484621 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.588609 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.588663 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.588679 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.588703 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.588721 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.692280 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.692329 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.692341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.692357 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.692369 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.795174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.795230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.795248 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.795273 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.795290 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.898229 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.898276 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.898286 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.898306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:11 crc kubenswrapper[4812]: I0131 04:27:11.898317 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:11Z","lastTransitionTime":"2026-01-31T04:27:11Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.001542 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.001612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.001636 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.001668 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.001690 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.105112 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.105185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.105203 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.105231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.105248 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.208164 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.208231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.208250 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.208276 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.208293 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.296670 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 03:16:35.444176866 +0000 UTC Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.311218 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.311288 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.311306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.311331 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.311348 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.339127 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:12 crc kubenswrapper[4812]: E0131 04:27:12.339311 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.371327 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2670
2f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.383210 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:12 crc kubenswrapper[4812]: E0131 04:27:12.383485 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:12 crc kubenswrapper[4812]: E0131 04:27:12.383629 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:14.383600838 +0000 UTC m=+42.878622533 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.387207 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\"
:\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.401156 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\
"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.414526 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.414581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.414596 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.414619 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.414636 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.434403 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://4dd63b3861e407be61c03c2e397b260c84e4b3b1dcfc57f375576ba7c606a4a8\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:06Z\\\",\\\"message\\\":\\\"inednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.198984 6106 handler.go:208] Removed *v1.Node event handler 7\\\\nI0131 04:27:06.199281 6106 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:06.199302 6106 handler.go:190] Sending *v1.Namespace event handler 1 for removal\\\\nI0131 04:27:06.199006 6106 
reflector.go:311] Stopping reflector *v1.Namespace (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:06.199328 6106 handler.go:190] Sending *v1.Namespace event handler 5 for removal\\\\nI0131 04:27:06.199355 6106 factory.go:656] Stopping watch factory\\\\nI0131 04:27:06.199383 6106 handler.go:208] Removed *v1.Namespace event handler 5\\\\nI0131 04:27:06.199412 6106 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:06.199480 6106 handler.go:208] Removed *v1.Node event handler 2\\\\nI0131 04:27:06.199179 6106 reflector.go:311] Stopping reflector *v1.EgressIP (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressip/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199226 6106 reflector.go:311] Stopping reflector *v1.ClusterUserDefinedNetwork (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/userdefinednetwork/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:06.199524 6106 handler.go:208] Removed *v1.Namespace ev\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"
/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\
",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.457558 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch 
status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\
\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.488211 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" 
err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b
9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"20
26-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"conta
inerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:0
1Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal 
error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.512063 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\
\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\
\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\"
:{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.517141 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.517299 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.517376 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.517447 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.517505 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.531444 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.549795 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.566222 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.580179 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc 
kubenswrapper[4812]: I0131 04:27:12.599011 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.617392 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.620055 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.620110 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: 
I0131 04:27:12.620136 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.620165 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.620184 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.632007 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d6879
3afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.650146 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.669121 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.685931 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:12Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.723273 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.723335 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.723352 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc 
kubenswrapper[4812]: I0131 04:27:12.723376 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.723394 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.826529 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.826586 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.826603 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.826627 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.826644 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.930184 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.930249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.930264 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.930291 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:12 crc kubenswrapper[4812]: I0131 04:27:12.930308 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:12Z","lastTransitionTime":"2026-01-31T04:27:12Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.034231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.034301 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.034320 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.034344 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.034362 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.137356 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.137426 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.137444 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.137473 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.137490 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.275510 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.275572 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.275585 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.275606 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.275620 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.297282 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 05:44:21.000719406 +0000 UTC Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.338750 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.338893 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:13 crc kubenswrapper[4812]: E0131 04:27:13.338941 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.338996 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:13 crc kubenswrapper[4812]: E0131 04:27:13.339121 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:13 crc kubenswrapper[4812]: E0131 04:27:13.339340 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.378753 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.378807 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.378829 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.378897 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.378916 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.481523 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.481591 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.481609 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.481641 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.481659 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.585502 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.585591 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.585618 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.585649 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.585670 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.688723 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.688796 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.688823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.688892 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.688916 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.791959 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.792014 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.792034 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.792057 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.792072 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.894976 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.895033 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.895050 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.895074 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.895095 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.997790 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.997891 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.997916 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.997951 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:13 crc kubenswrapper[4812]: I0131 04:27:13.997971 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:13Z","lastTransitionTime":"2026-01-31T04:27:13Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.100960 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.101019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.101034 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.101060 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.101077 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.203670 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.203730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.203747 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.203769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.203786 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.298325 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-20 05:54:57.832862952 +0000 UTC Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.307109 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.307162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.307178 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.307201 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.307218 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.338799 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:14 crc kubenswrapper[4812]: E0131 04:27:14.339056 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.403770 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:14 crc kubenswrapper[4812]: E0131 04:27:14.404089 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:14 crc kubenswrapper[4812]: E0131 04:27:14.404245 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:18.404207237 +0000 UTC m=+46.899228962 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.411401 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.411460 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.411478 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.411503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.411521 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.514777 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.514833 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.514890 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.514919 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.514944 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.617882 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.617972 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.617997 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.618030 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.618053 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.720325 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.720375 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.720388 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.720407 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.720421 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.823298 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.823333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.823343 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.823361 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.823373 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.926294 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.926356 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.926375 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.926402 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:14 crc kubenswrapper[4812]: I0131 04:27:14.926418 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:14Z","lastTransitionTime":"2026-01-31T04:27:14Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.029193 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.029282 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.029306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.029341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.029366 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.132357 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.132487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.132510 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.132536 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.132553 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.236123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.236179 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.236195 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.236217 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.236234 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.298982 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-31 00:34:15.063922888 +0000 UTC Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338135 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338184 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338203 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338225 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338241 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338584 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338665 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.338671 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:15 crc kubenswrapper[4812]: E0131 04:27:15.338800 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:15 crc kubenswrapper[4812]: E0131 04:27:15.338954 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:15 crc kubenswrapper[4812]: E0131 04:27:15.339074 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.441130 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.441201 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.441223 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.441250 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.441269 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.544635 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.544667 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.544675 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.544689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.544698 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.648170 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.648565 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.648735 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.648936 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.649079 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.751433 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.751771 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.751993 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.752191 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.752317 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.855690 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.855746 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.855763 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.855785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.855804 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.958744 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.958874 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.958899 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.958961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:15 crc kubenswrapper[4812]: I0131 04:27:15.958983 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:15Z","lastTransitionTime":"2026-01-31T04:27:15Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.062589 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.062648 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.062670 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.062697 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.062717 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.166066 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.166136 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.166162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.166193 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.166216 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.268693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.268748 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.268766 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.268791 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.268809 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.299166 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 03:41:37.2428982 +0000 UTC Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.339016 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:16 crc kubenswrapper[4812]: E0131 04:27:16.339254 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.372320 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.372404 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.372425 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.372450 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.372468 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.476085 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.476151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.476167 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.476188 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.476204 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.579358 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.579418 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.579434 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.579457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.579474 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.683880 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.683962 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.683981 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.684011 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.684035 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.787551 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.787633 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.787658 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.787692 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.787716 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.890335 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.890400 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.890419 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.890446 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.890463 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.992805 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.992919 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.992944 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.992972 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:16 crc kubenswrapper[4812]: I0131 04:27:16.992994 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:16Z","lastTransitionTime":"2026-01-31T04:27:16Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.096127 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.096198 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.096220 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.096247 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.096270 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.199034 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.199158 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.199185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.199215 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.199238 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.299611 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-07 16:16:04.773712964 +0000 UTC Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.301794 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.301877 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.301894 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.301917 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.301936 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.338571 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.338614 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.338566 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:17 crc kubenswrapper[4812]: E0131 04:27:17.338801 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:17 crc kubenswrapper[4812]: E0131 04:27:17.338723 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:17 crc kubenswrapper[4812]: E0131 04:27:17.338989 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.405062 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.405153 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.405186 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.405219 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.405242 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.508266 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.508325 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.508341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.508365 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.508382 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.611072 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.611121 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.611133 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.611154 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.611168 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.714541 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.714595 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.714611 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.714632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.714652 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.816952 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.817017 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.817035 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.817059 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.817077 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.920486 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.920565 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.920590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.920618 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:17 crc kubenswrapper[4812]: I0131 04:27:17.920638 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:17Z","lastTransitionTime":"2026-01-31T04:27:17Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.024504 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.024586 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.024608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.024639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.024662 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.128428 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.128504 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.128526 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.128549 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.128569 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.231336 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.231421 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.231444 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.231487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.231509 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.300835 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-16 17:09:47.031065225 +0000 UTC Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.334182 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.334235 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.334254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.334278 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.334297 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.340117 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.340326 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.435889 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.435948 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.435972 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.436003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.436023 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.449487 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.449699 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.449795 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:26.449770285 +0000 UTC m=+54.944792050 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.457302 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb4
9c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc15c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\"
:[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d4
6c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\
\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d
92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"},\\\"runtimeHandlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:18Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.462330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.462588 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.462718 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.462886 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.463068 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.488271 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.488512 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.488689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.488898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.489079 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.508963 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:18Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.513480 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.513667 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.513824 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.514049 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.514242 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.534897 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:18Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.539538 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.539604 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.539628 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.539660 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.539686 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.563355 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:18Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"},\\\"runtimeHan
dlers\\\":[{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":false},\\\"name\\\":\\\"runc\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"crun\\\"},{\\\"features\\\":{\\\"recursiveReadOnlyMounts\\\":true,\\\"userNamespaces\\\":true},\\\"name\\\":\\\"\\\"}]}}\" for node \"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:18Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:18 crc kubenswrapper[4812]: E0131 04:27:18.563660 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.565928 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.565979 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.565994 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.566016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.566031 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in 
/etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.668405 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.668484 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.668503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.668527 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.668546 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.771035 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.771069 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.771079 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.771092 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.771102 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.874088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.874144 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.874166 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.874195 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.874215 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.977346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.977399 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.977415 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.977439 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:18 crc kubenswrapper[4812]: I0131 04:27:18.977455 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:18Z","lastTransitionTime":"2026-01-31T04:27:18Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.080289 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.080355 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.080381 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.080409 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.080431 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.182922 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.182987 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.183006 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.183029 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.183047 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.285622 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.285672 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.285688 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.285710 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.285727 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.301295 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 06:12:12.554156951 +0000 UTC Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.338609 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.338637 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.338646 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:19 crc kubenswrapper[4812]: E0131 04:27:19.338769 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:19 crc kubenswrapper[4812]: E0131 04:27:19.339011 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:19 crc kubenswrapper[4812]: E0131 04:27:19.339197 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.389462 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.389527 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.389544 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.389571 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.389589 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.492682 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.492752 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.492776 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.492805 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.492827 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.596003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.596064 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.596084 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.596106 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.596122 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.627475 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.628398 4812 scope.go:117] "RemoveContainer" containerID="154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.652532 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"s
tartedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/k
ubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e
52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\
"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.672211 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.686355 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.698571 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.698621 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.698633 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.698648 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.698659 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.714502 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":1,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 10s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.733909 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.755086 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.775483 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\
\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"
terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.797300 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.801474 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.801535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.801549 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 
04:27:19.801587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.801599 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.824518 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.840514 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.855658 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc 
kubenswrapper[4812]: I0131 04:27:19.873528 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.889404 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.902189 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.909236 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.909282 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.909295 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.909318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.909331 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:19Z","lastTransitionTime":"2026-01-31T04:27:19Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.918826 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.933551 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:19 crc kubenswrapper[4812]: I0131 04:27:19.947967 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:19Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.012528 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.012573 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.012590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc 
kubenswrapper[4812]: I0131 04:27:20.012612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.012627 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.115304 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.115348 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.115361 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.115378 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.115392 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.217722 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.217771 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.217784 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.217802 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.217814 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.301682 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 08:41:11.670444045 +0000 UTC Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.320099 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.320149 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.320160 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.320176 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.320187 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.339484 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:20 crc kubenswrapper[4812]: E0131 04:27:20.339623 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.422413 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.422465 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.422481 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.422502 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.422517 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.524516 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.524569 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.524581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.524600 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.524613 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.627027 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.627096 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.627119 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.627151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.627173 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.677425 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/1.log" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.680193 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.680649 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.700081 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"
ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404
b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.721055 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.729455 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.729486 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.729495 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.729509 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.729518 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady 
message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.738318 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.756076 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.770816 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.783564 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.799542 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.809994 4812 
status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"
hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.822646 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e1
8d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook 
\"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.833893 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.833925 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.833935 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.833949 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.833960 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.836707 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled
\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.849247 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc 
kubenswrapper[4812]: I0131 04:27:20.862748 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\"
,\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17
ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.874683 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.903351 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.919438 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.932788 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.936387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.936507 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.936597 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.936689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.936749 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:20Z","lastTransitionTime":"2026-01-31T04:27:20Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:20 crc kubenswrapper[4812]: I0131 04:27:20.950534 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:20Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.039513 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.039599 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.039618 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.039642 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.039659 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.141921 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.141979 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.141996 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.142020 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.142040 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.245178 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.245237 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.245258 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.245280 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.245297 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.303037 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-21 06:02:49.039681898 +0000 UTC Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.338735 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.338813 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.338735 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:21 crc kubenswrapper[4812]: E0131 04:27:21.338980 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:21 crc kubenswrapper[4812]: E0131 04:27:21.339126 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:21 crc kubenswrapper[4812]: E0131 04:27:21.339295 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.348201 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.348248 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.348265 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.348286 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.348304 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.452039 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.452105 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.452123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.452148 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.452165 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.554938 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.555003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.555021 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.555043 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.555059 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.657543 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.657908 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.658094 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.658250 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.658386 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.685454 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/2.log" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.686110 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/1.log" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.689443 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" exitCode=1 Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.689486 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.689541 4812 scope.go:117] "RemoveContainer" containerID="154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.690374 4812 scope.go:117] "RemoveContainer" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" Jan 31 04:27:21 crc kubenswrapper[4812]: E0131 04:27:21.690568 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.713937 4812 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\
\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.731775 4812 status_manager.go:875] "Failed to update status 
for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\
\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:
26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOn
ly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84e
a35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod 
\"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.750636 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\
\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c
81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.763693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.763755 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.763779 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.763809 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.763834 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not 
ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.768647 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.783737 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.800555 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.815530 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc 
kubenswrapper[4812]: I0131 04:27:21.835858 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.857674 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.867637 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.867686 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: 
I0131 04:27:21.867698 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.867717 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.867732 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.870641 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d6879
3afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.883692 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.899923 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.918921 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.950264 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.968111 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.970174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.970233 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.970247 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.970270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.970285 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:21Z","lastTransitionTime":"2026-01-31T04:27:21Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin 
returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:21 crc kubenswrapper[4812]: I0131 04:27:21.983591 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"n
ame\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:21Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.013584 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 
04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event 
handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"ku
be-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\
\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.073128 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.073176 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.073185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.073200 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.073209 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.175986 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.176031 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.176040 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.176057 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.176068 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.279334 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.279374 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.279386 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.279404 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.279416 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.305035 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-06 21:39:41.342272452 +0000 UTC Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.338888 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:22 crc kubenswrapper[4812]: E0131 04:27:22.339072 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.369266 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/s
tatic-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\
"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.382361 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.382401 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.382420 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.382437 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 
04:27:22.382450 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.393871 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.428864 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.440182 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.448851 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.466934 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://154cb03a3b45754d95c53264c4553a5071afa425db8281ec10163bdee7db1a2e\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:08Z\\\",\\\"message\\\":\\\"/crd/egressfirewall/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529085 6234 reflector.go:311] Stopping reflector *v1.NetworkPolicy (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.529132 6234 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from 
github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529163 6234 reflector.go:311] Stopping reflector *v1.EgressService (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressservice/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:07.529229 6234 reflector.go:311] Stopping reflector *v1.Service (0s) from k8s.io/client-go/informers/factory.go:160\\\\nI0131 04:27:07.530590 6234 handler.go:190] Sending *v1.Pod event handler 3 for removal\\\\nI0131 04:27:07.530687 6234 handler.go:190] Sending *v1.Pod event handler 6 for removal\\\\nI0131 04:27:07.530794 6234 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:07.530830 6234 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:07.530916 6234 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:07.530987 6234 factory.go:656] Stopping watch factory\\\\nI0131 04:27:07.531014 6234 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:06Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 
reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\
"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.485017 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" 
event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.485080 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.485098 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.485124 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.485142 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.486635 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.499423 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.515195 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.531975 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.551679 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.568024 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/se
rving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.584639 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restart
Count\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.587906 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.587968 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.587986 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.588010 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.588028 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.597552 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"re
adOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.610895 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"q
uay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod 
\"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.628434 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"k
ube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post 
\"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.644546 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc 
kubenswrapper[4812]: I0131 04:27:22.690128 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.690173 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.690184 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.690204 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.690217 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.694704 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/2.log" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.698829 4812 scope.go:117] "RemoveContainer" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" Jan 31 04:27:22 crc kubenswrapper[4812]: E0131 04:27:22.699024 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.717960 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.730564 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.757037 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.781287 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.793508 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.793564 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.793583 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.793608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.793627 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.800022 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.816169 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.832384 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.849103 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.870872 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed 
to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464
b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:
26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\
\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"r
eason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error 
occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.887451 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/v
ar/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.896172 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.896248 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 
04:27:22.896270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.896300 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.896323 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.901900 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfa
bf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.916604 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.932602 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.947284 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc 
kubenswrapper[4812]: I0131 04:27:22.967221 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.985233 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
1-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854
fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.999362 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.999409 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.999426 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:22 crc 
kubenswrapper[4812]: I0131 04:27:22.999448 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:22 crc kubenswrapper[4812]: I0131 04:27:22.999464 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:22Z","lastTransitionTime":"2026-01-31T04:27:22Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.001723 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:22Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.102494 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.102555 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.102572 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc 
kubenswrapper[4812]: I0131 04:27:23.102596 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.102614 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.206689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.206754 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.206770 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.206798 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.206818 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.306059 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-23 10:13:34.975864883 +0000 UTC Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.309493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.309537 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.309556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.309581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.309600 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.339327 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.339417 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.339476 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.339646 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.339877 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.340057 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.399302 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.399502 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.399601 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:55.399576515 +0000 UTC m=+83.894598210 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.412237 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.412319 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.412338 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.412364 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.412384 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.500280 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.500550 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:27:55.500516721 +0000 UTC m=+83.995538406 (durationBeforeRetry 32s). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.500658 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.500713 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: 
\"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.500796 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.500902 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.500988 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501008 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501023 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501054 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered 
Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501109 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501129 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.500994 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:55.500970812 +0000 UTC m=+83.995992507 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501216 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:55.501199048 +0000 UTC m=+83.996220723 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:23 crc kubenswrapper[4812]: E0131 04:27:23.501248 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:55.50123609 +0000 UTC m=+83.996257895 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.515799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.515898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.515917 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.515946 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.515972 4812 setters.go:603] "Node became not ready" 
node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.619353 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.619412 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.619429 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.619457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.619476 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.722739 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.722783 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.722794 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.722811 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.722822 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.827388 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.827477 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.827494 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.827518 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.827536 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.930103 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.930151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.930162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.930183 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:23 crc kubenswrapper[4812]: I0131 04:27:23.930195 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:23Z","lastTransitionTime":"2026-01-31T04:27:23Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.032722 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.032767 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.032780 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.032798 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.032809 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.135309 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.135357 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.135377 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.135405 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.135428 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.238977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.239037 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.239055 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.239078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.239095 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.306456 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 08:19:27.910286527 +0000 UTC Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.339392 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:24 crc kubenswrapper[4812]: E0131 04:27:24.339545 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.341888 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.341941 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.341958 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.341979 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.341995 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.444298 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.444351 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.444366 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.444382 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.444395 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.546799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.546894 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.546913 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.546941 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.546965 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.649488 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.649554 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.649572 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.649596 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.649613 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.752303 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.752346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.752355 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.752369 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.752379 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.855802 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.856064 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.856130 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.856197 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.856262 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.961708 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.961779 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.961798 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.961821 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:24 crc kubenswrapper[4812]: I0131 04:27:24.961918 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:24Z","lastTransitionTime":"2026-01-31T04:27:24Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.064341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.064418 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.064441 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.064469 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.064490 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.167874 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.167933 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.167946 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.167965 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.167978 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.271511 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.271572 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.271589 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.271611 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.271629 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.307102 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-02 10:34:34.013834202 +0000 UTC Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.338696 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.338729 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.338770 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:25 crc kubenswrapper[4812]: E0131 04:27:25.338902 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:25 crc kubenswrapper[4812]: E0131 04:27:25.339054 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:25 crc kubenswrapper[4812]: E0131 04:27:25.339168 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.377244 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.377624 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.377657 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.377688 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.377713 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.480379 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.480465 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.480509 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.480537 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.480557 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.582928 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.582985 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.583002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.583027 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.583044 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.685739 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.685807 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.685826 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.685888 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.685910 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.788984 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.789049 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.789065 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.789089 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.789106 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.891972 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.892072 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.892089 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.892113 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.892131 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.994786 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.994867 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.994883 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.994905 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:25 crc kubenswrapper[4812]: I0131 04:27:25.994922 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:25Z","lastTransitionTime":"2026-01-31T04:27:25Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.097830 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.097917 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.097935 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.097961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.097978 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.201261 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.201322 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.201341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.201365 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.201383 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.304358 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.304429 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.304446 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.304876 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.304929 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.307602 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-20 11:01:11.130155398 +0000 UTC Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.339410 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:26 crc kubenswrapper[4812]: E0131 04:27:26.339572 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.408258 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.408309 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.408327 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.408349 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.408368 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.511080 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.511135 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.511152 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.511174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.511190 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.535891 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:26 crc kubenswrapper[4812]: E0131 04:27:26.536137 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:26 crc kubenswrapper[4812]: E0131 04:27:26.536279 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:27:42.536250315 +0000 UTC m=+71.031272020 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.614501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.614572 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.614590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.614614 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.614633 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.717233 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.717289 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.717306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.717331 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.717353 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.820577 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.820630 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.820647 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.820672 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.820689 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.923550 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.923602 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.923619 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.923645 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:26 crc kubenswrapper[4812]: I0131 04:27:26.923662 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:26Z","lastTransitionTime":"2026-01-31T04:27:26Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.026522 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.026581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.026598 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.026623 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.026640 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.135654 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.136044 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.136140 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.136231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.136333 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.239594 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.239639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.239649 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.239666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.239678 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.308412 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-23 12:10:54.990145446 +0000 UTC Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.338740 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.338916 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:27 crc kubenswrapper[4812]: E0131 04:27:27.339088 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.339135 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:27 crc kubenswrapper[4812]: E0131 04:27:27.339269 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:27 crc kubenswrapper[4812]: E0131 04:27:27.339395 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.342600 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.343016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.343270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.343827 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.344342 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.447750 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.448030 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.448050 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.448074 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.448090 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.519264 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.534675 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-scheduler/openshift-kube-scheduler-crc"] Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.550635 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.550676 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.550688 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.550725 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.550737 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.558065 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] 
Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.591911 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.609784 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.624547 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.642267 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was 
deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.653790 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.653830 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.653858 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.653873 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.653887 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.661188 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\
\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:
26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.680367 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additiona
l-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8
c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",
\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/
ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/op
t/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.701256 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.720725 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.738196 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.756000 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.757281 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.757356 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.757381 4812 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.757415 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.757439 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.775221 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 
31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.797126 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.818033 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.834738 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.855898 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluste
r-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kuber
netes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.860739 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.860804 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 
04:27:27.860829 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.860904 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.860922 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.874470 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:27Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.963267 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.963303 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.963313 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:27 crc 
kubenswrapper[4812]: I0131 04:27:27.963330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:27 crc kubenswrapper[4812]: I0131 04:27:27.963340 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:27Z","lastTransitionTime":"2026-01-31T04:27:27Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.065939 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.066008 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.066020 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.066038 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.066051 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.168611 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.168662 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.168679 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.168703 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.168720 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.272093 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.272162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.272185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.272214 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.272236 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.308967 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-24 13:37:41.929376647 +0000 UTC Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.339605 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.339891 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.374964 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.375010 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.375022 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.375039 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.375051 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.477608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.477666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.477690 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.477718 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.477740 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.581198 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.581277 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.581297 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.581323 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.581342 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.630529 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.630597 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.630616 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.630643 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.630661 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.651632 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:28Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.657968 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.658026 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.658045 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.658069 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.658087 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.681986 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:28Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.686535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.686609 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.686629 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.686654 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.686672 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.705260 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:28Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.709436 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.709517 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.709534 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.709561 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.709578 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.728566 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:28Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.733471 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.733534 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.733551 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.733577 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.733595 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.752618 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:28Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:28Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:28 crc kubenswrapper[4812]: E0131 04:27:28.752981 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.754961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.755003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.755016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.755031 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.755042 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.858205 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.858259 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.858275 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.858297 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.858318 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.961921 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.961984 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.962001 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.962028 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:28 crc kubenswrapper[4812]: I0131 04:27:28.962050 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:28Z","lastTransitionTime":"2026-01-31T04:27:28Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.064795 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.064893 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.064911 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.064934 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.064952 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.168543 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.168600 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.168620 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.168644 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.168660 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.271552 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.271621 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.271638 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.271662 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.271684 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.309941 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-27 14:15:28.168566666 +0000 UTC Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.339685 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.339739 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.339739 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:29 crc kubenswrapper[4812]: E0131 04:27:29.339994 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:29 crc kubenswrapper[4812]: E0131 04:27:29.340155 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:29 crc kubenswrapper[4812]: E0131 04:27:29.340294 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.374333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.374414 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.374430 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.374447 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.374460 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.477148 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.477206 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.477222 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.477246 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.477262 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.579646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.579706 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.579721 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.579739 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.579752 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.682403 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.682447 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.682457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.682472 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.682482 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.786016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.786055 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.786065 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.786079 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.786088 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.888328 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.888405 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.888417 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.888434 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.888446 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.992334 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.992737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.992938 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.993105 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:29 crc kubenswrapper[4812]: I0131 04:27:29.993262 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:29Z","lastTransitionTime":"2026-01-31T04:27:29Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.095890 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.096730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.097013 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.097506 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.097662 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.200136 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.200215 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.200232 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.200257 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.200274 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.302457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.302801 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.303049 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.303255 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.303429 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.310721 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-07 17:38:59.26238207 +0000 UTC Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.339071 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:30 crc kubenswrapper[4812]: E0131 04:27:30.339327 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.406957 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.407058 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.407078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.407101 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.407118 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.510152 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.510360 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.510530 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.510671 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.510808 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.613086 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.613128 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.613140 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.613154 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.613166 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.716169 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.716227 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.716237 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.716256 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.716268 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.818737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.818773 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.818783 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.818797 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.818805 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.920690 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.920738 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.920757 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.920779 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:30 crc kubenswrapper[4812]: I0131 04:27:30.920825 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:30Z","lastTransitionTime":"2026-01-31T04:27:30Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.023666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.023715 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.023735 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.023759 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.023777 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.126485 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.126517 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.126526 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.126539 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.126548 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.229317 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.229361 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.229370 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.229385 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.229394 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.311409 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-10 08:01:13.059490786 +0000 UTC Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.331758 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.331812 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.331821 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.331856 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.331871 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.339136 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.339199 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.339207 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:31 crc kubenswrapper[4812]: E0131 04:27:31.339335 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:31 crc kubenswrapper[4812]: E0131 04:27:31.339497 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:31 crc kubenswrapper[4812]: E0131 04:27:31.339683 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.435087 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.435157 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.435178 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.435207 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.435232 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.538124 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.538188 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.538208 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.538232 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.538249 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.641604 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.641645 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.641657 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.641674 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.641687 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.744453 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.744531 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.744554 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.744582 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.744603 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.847906 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.847984 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.848008 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.848037 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.848056 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.951310 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.951356 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.951368 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.951385 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:31 crc kubenswrapper[4812]: I0131 04:27:31.951397 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:31Z","lastTransitionTime":"2026-01-31T04:27:31Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.053805 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.053928 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.053951 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.053979 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.054007 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.155935 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.155968 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.155976 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.155987 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.155996 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.262949 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.263162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.263223 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.263271 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.263300 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.311906 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-13 12:32:56.978860891 +0000 UTC Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.338735 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:32 crc kubenswrapper[4812]: E0131 04:27:32.338954 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.355657 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/servic
eaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"re
ady\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"last
State\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\
\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.365895 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.365953 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.365971 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.365995 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.366014 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.375967 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.393791 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.409644 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.424437 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\
"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.441741 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to 
patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc 
kubenswrapper[4812]: I0131 04:27:32.461319 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.469951 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.470123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.470153 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.470188 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.470225 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.484099 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.498755 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.508948 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.522156 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.538629 4812 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6
b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-
host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.554622 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.571834 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.573139 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.573199 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.573222 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc 
kubenswrapper[4812]: I0131 04:27:32.573254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.573332 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.605776 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.621283 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.634525 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.660315 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:32Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.676604 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.676669 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.676686 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.676709 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.676728 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.779478 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.779560 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.779578 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.779602 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.779622 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.883100 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.883207 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.883234 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.883264 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.883286 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.987078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.987167 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.987188 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.987214 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:32 crc kubenswrapper[4812]: I0131 04:27:32.987233 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:32Z","lastTransitionTime":"2026-01-31T04:27:32Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.089485 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.089898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.090258 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.090492 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.090701 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.193547 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.193612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.193636 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.193666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.193688 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.297285 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.297333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.297349 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.297371 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.297389 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.312203 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-11 19:49:54.522929438 +0000 UTC Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.338452 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.338457 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.338542 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:33 crc kubenswrapper[4812]: E0131 04:27:33.338648 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:33 crc kubenswrapper[4812]: E0131 04:27:33.338880 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:33 crc kubenswrapper[4812]: E0131 04:27:33.338924 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.400208 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.400280 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.400305 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.400334 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.400356 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.503488 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.503555 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.503573 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.503600 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.503619 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.607197 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.607251 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.607272 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.607307 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.607326 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.709749 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.709829 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.709889 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.709922 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.709944 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.812223 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.812277 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.812288 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.812308 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.812320 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.915554 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.915624 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.915649 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.915679 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:33 crc kubenswrapper[4812]: I0131 04:27:33.915702 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:33Z","lastTransitionTime":"2026-01-31T04:27:33Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.018436 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.018479 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.018487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.018501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.018546 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.120923 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.120972 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.120990 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.121012 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.121029 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.225685 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.225722 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.225730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.225744 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.225753 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.312362 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-07 10:23:33.388438351 +0000 UTC Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.329173 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.329231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.329247 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.329270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.329289 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.338993 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:34 crc kubenswrapper[4812]: E0131 04:27:34.339631 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.340177 4812 scope.go:117] "RemoveContainer" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" Jan 31 04:27:34 crc kubenswrapper[4812]: E0131 04:27:34.341388 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 20s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.432730 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.432974 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.432996 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.433024 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.433042 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.536745 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.536801 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.536820 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.536881 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.536902 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.640883 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.641213 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.641340 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.641558 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.641813 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.743390 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.743430 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.743441 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.743457 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.743469 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.845774 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.845830 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.845884 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.845908 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.845939 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.949128 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.949703 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.949922 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.950162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:34 crc kubenswrapper[4812]: I0131 04:27:34.950314 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:34Z","lastTransitionTime":"2026-01-31T04:27:34Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.052824 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.052928 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.052956 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.052988 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.053011 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.155892 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.155949 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.155966 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.155989 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.156005 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.261379 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.261425 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.261445 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.261467 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.261483 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.313089 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-09 22:07:19.095459925 +0000 UTC Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.339027 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.339120 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.339044 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:35 crc kubenswrapper[4812]: E0131 04:27:35.339218 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:35 crc kubenswrapper[4812]: E0131 04:27:35.339308 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:35 crc kubenswrapper[4812]: E0131 04:27:35.339473 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.364127 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.364199 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.364225 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.364254 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.364276 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.467408 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.467464 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.467480 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.467505 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.467550 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.577256 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.577341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.577496 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.577691 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.577718 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.681538 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.681606 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.681627 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.681654 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.681675 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.784872 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.784947 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.784971 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.785002 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.785022 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.888387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.888456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.888479 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.888506 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.888523 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.990995 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.991048 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.991066 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.991088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:35 crc kubenswrapper[4812]: I0131 04:27:35.991105 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:35Z","lastTransitionTime":"2026-01-31T04:27:35Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.094234 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.094293 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.094306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.094323 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.094335 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.197057 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.197099 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.197109 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.197125 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.197134 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.300229 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.300283 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.300299 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.300320 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.300334 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.313520 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-02 06:31:13.932638836 +0000 UTC Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.339273 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:36 crc kubenswrapper[4812]: E0131 04:27:36.339452 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.402431 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.402463 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.402471 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.402483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.402491 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.505460 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.505513 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.505524 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.505541 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.505553 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.607961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.608000 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.608008 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.608023 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.608032 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.710282 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.710341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.710350 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.710364 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.710372 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.813270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.813306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.813317 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.813331 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.813342 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.915769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.916439 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.916587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.916717 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:36 crc kubenswrapper[4812]: I0131 04:27:36.916908 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:36Z","lastTransitionTime":"2026-01-31T04:27:36Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.019754 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.020123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.020346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.020608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.020798 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.124668 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.124965 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.125070 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.125162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.125261 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.227601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.227639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.227648 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.227665 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.227676 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.314643 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-27 03:50:32.448525993 +0000 UTC Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.330059 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.330131 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.330148 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.330174 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.330195 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.338630 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.338676 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:37 crc kubenswrapper[4812]: E0131 04:27:37.338802 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.338819 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:37 crc kubenswrapper[4812]: E0131 04:27:37.339002 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:37 crc kubenswrapper[4812]: E0131 04:27:37.339117 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.432870 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.432929 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.432945 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.432977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.432994 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.536438 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.536483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.536499 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.536520 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.536538 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.638435 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.638524 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.638542 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.638594 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.638613 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.742410 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.742450 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.742460 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.742474 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.742483 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.845448 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.845477 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.845486 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.845499 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.845508 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.949073 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.949127 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.949137 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.949151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:37 crc kubenswrapper[4812]: I0131 04:27:37.949177 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:37Z","lastTransitionTime":"2026-01-31T04:27:37Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.052622 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.052930 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.052961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.052986 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.053003 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.156190 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.156266 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.156286 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.156309 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.156329 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.258306 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.258370 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.258394 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.258424 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.258442 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.315811 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 04:12:29.037522251 +0000 UTC Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.339292 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:38 crc kubenswrapper[4812]: E0131 04:27:38.339535 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.361303 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.361362 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.361383 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.361412 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.361434 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.463708 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.463774 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.463793 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.463815 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.463832 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.566314 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.566359 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.566370 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.566387 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.566400 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.668965 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.668998 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.669006 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.669019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.669028 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.771940 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.771978 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.771989 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.772003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.772013 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.874493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.874601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.874611 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.874626 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.874634 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.976981 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.977249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.977346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.977463 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:38 crc kubenswrapper[4812]: I0131 04:27:38.977580 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:38Z","lastTransitionTime":"2026-01-31T04:27:38Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.080884 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.080922 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.080930 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.080946 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.080956 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.112879 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.112949 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.112971 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.112997 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.113016 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.131488 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:39Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.137168 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.137317 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.137393 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.137479 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.137583 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.155620 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:39Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.160155 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.160205 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.160230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.160247 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.160258 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.184548 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.184583 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.184591 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.184604 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.184614 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.208226 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.208251 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.208280 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.208293 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.208302 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.230358 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:39Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:39Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.230578 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.232270 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.232314 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.232330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.232353 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.232370 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.316611 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-15 18:34:03.222743712 +0000 UTC Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.334726 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.334772 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.334788 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.334809 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.334825 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.339351 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.339376 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.339462 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.339462 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.339576 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:39 crc kubenswrapper[4812]: E0131 04:27:39.339646 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.437606 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.437646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.437663 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.437685 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.437701 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.540640 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.540694 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.540753 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.540778 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.540830 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.643764 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.643819 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.643864 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.643888 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.643906 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.746487 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.746543 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.746556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.746574 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.746587 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.849628 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.849687 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.849703 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.849726 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.849743 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.952034 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.952094 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.952112 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.952138 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:39 crc kubenswrapper[4812]: I0131 04:27:39.952156 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:39Z","lastTransitionTime":"2026-01-31T04:27:39Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.055207 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.055295 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.055318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.055347 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.055365 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.157456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.157511 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.157528 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.157553 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.157571 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.260085 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.260118 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.260128 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.260140 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.260149 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.316766 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-16 14:57:20.315623322 +0000 UTC Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.339442 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:40 crc kubenswrapper[4812]: E0131 04:27:40.339681 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.352931 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-crc"] Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.361873 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.361904 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.361914 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.361926 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.361935 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.464326 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.464371 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.464382 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.464398 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.464411 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.566740 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.566999 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.567041 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.567078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.567098 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.668615 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.668656 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.668669 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.668684 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.668695 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.771070 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.771125 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.771144 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.771185 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.771208 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.873978 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.874025 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.874035 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.874047 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.874056 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.976421 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.976779 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.976966 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.977117 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:40 crc kubenswrapper[4812]: I0131 04:27:40.977256 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:40Z","lastTransitionTime":"2026-01-31T04:27:40Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.079700 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.080092 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.080267 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.080459 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.080599 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.183534 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.183639 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.183666 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.183693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.183710 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.286372 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.286418 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.286429 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.286446 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.286458 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.317916 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 14:17:17.819594571 +0000 UTC Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.339311 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.339369 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.339311 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:41 crc kubenswrapper[4812]: E0131 04:27:41.339508 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:41 crc kubenswrapper[4812]: E0131 04:27:41.339641 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:41 crc kubenswrapper[4812]: E0131 04:27:41.339777 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.388678 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.388731 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.388748 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.388772 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.388790 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.491290 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.491343 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.491359 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.491383 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.491399 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.593881 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.593914 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.593923 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.593938 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.593948 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.696590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.696632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.696642 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.696659 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.696671 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.798440 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.798483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.798501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.798524 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.798540 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.900708 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.900771 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.900795 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.900821 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:41 crc kubenswrapper[4812]: I0131 04:27:41.900916 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:41Z","lastTransitionTime":"2026-01-31T04:27:41Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.002785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.002820 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.002830 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.002858 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.002869 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.105441 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.105501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.105512 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.105529 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.105541 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.207608 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.207727 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.207743 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.207765 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.207781 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.310407 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.310455 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.310471 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.310492 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.310507 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.318584 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-01 14:37:50.724647493 +0000 UTC Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.339122 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:42 crc kubenswrapper[4812]: E0131 04:27:42.339273 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.356307 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] 
Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.375939 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.385322 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.397450 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413369 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413310 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413404 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413414 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413428 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.413438 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.425531 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\
\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.440238 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a9
5b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disa
bled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"ter
minated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"
readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f1
5e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.453245 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.466809 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.483275 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.495106 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.506119 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.515699 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc 
kubenswrapper[4812]: I0131 04:27:42.516218 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.516282 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.516303 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.516333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.516354 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.527971 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.539682 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.549417 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.560752 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler
\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\
\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.571051 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.580228 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:42Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.606065 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:42 crc kubenswrapper[4812]: E0131 04:27:42.606242 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object 
"openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:42 crc kubenswrapper[4812]: E0131 04:27:42.606323 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:14.606304999 +0000 UTC m=+103.101326664 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.619013 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.619059 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.619076 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.619100 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.619116 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.721585 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.721621 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.721632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.721646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.721657 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.825498 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.826016 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.826088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.826621 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.826684 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.929038 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.929112 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.929134 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.929162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:42 crc kubenswrapper[4812]: I0131 04:27:42.929185 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:42Z","lastTransitionTime":"2026-01-31T04:27:42Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.031640 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.031672 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.031682 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.031711 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.031721 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.134224 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.134273 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.134284 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.134302 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.134313 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.236478 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.236546 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.236564 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.236588 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.236605 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.319529 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-28 15:19:41.719030775 +0000 UTC Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.365422 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.365487 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.365422 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:43 crc kubenswrapper[4812]: E0131 04:27:43.365632 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:43 crc kubenswrapper[4812]: E0131 04:27:43.365757 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:43 crc kubenswrapper[4812]: E0131 04:27:43.365873 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.367077 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.367132 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.367149 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.367169 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.367185 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.469617 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.469664 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.469675 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.469694 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.469706 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.572582 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.572646 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.572663 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.572689 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.572707 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.675369 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.675434 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.675456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.675535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.675557 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.765366 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/0.log" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.765415 4812 generic.go:334] "Generic (PLEG): container finished" podID="6050f642-2492-4f83-a739-ac905c409b8c" containerID="d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049" exitCode=1 Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.765444 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerDied","Data":"d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.765790 4812 scope.go:117] "RemoveContainer" containerID="d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.777785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.777822 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.777831 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.777861 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.777871 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.801324 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59d
a973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.818453 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.831115 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.858932 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.874920 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"containers with unready 
status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"containers with unready status: [kube-multus]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\
\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.880225 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.880260 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.880272 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.880288 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.880300 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.895457 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.908708 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc 
kubenswrapper[4812]: I0131 04:27:43.926260 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5
496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.941225 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The 
container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.952436 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.967018 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.980216 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:43 crc 
kubenswrapper[4812]: I0131 04:27:43.983075 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.983113 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.983126 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.983144 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.983156 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:43Z","lastTransitionTime":"2026-01-31T04:27:43Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:43 crc kubenswrapper[4812]: I0131 04:27:43.998989 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:43Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.016726 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.025906 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.036804 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.046477 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\
\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c
4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.059000 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.071247 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.086820 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.086866 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.086881 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc 
kubenswrapper[4812]: I0131 04:27:44.086898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.086909 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.189915 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.189998 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.190021 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.190053 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.190078 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.293385 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.293444 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.293459 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.293483 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.293501 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.320704 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-13 17:51:18.348051313 +0000 UTC Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.339426 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:44 crc kubenswrapper[4812]: E0131 04:27:44.339601 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.396738 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.396815 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.396832 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.396891 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.396909 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.499567 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.499593 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.499601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.499614 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.499624 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.602905 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.603003 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.603025 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.603055 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.603079 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.706162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.706212 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.706231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.706253 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.706270 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.771663 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/0.log" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.771748 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerStarted","Data":"4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.786419 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-
release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.809236 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.809304 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.809317 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.809337 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc 
kubenswrapper[4812]: I0131 04:27:44.809349 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.812540 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] 
Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":2,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 20s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.844819 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.859135 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.877451 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.895869 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.910620 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.911199 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.911258 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.911276 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.911300 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.911317 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:44Z","lastTransitionTime":"2026-01-31T04:27:44Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.930770 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOn
ly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"
containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveRead
Only\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\"
:0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\"
,\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.944310 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e
18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc 
kubenswrapper[4812]: I0131 04:27:44.963438 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5
496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"
name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/k
ube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.975980 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\
",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:44 crc kubenswrapper[4812]: I0131 04:27:44.991793 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:44Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.007477 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.014430 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.014471 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.014489 4812 kubelet_node_status.go:724] "Recording 
event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.014512 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.014528 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.021955 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 
31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.041355 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod 
\"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.057201 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-id
entity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.073537 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.090615 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.109734 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:45Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.117966 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.118025 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.118046 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.118071 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.118089 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.220428 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.220489 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.220505 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.220524 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.220537 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.321599 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 17:14:01.313080796 +0000 UTC Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.322975 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.323034 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.323051 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.323077 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.323101 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.339525 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.339579 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.339625 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:45 crc kubenswrapper[4812]: E0131 04:27:45.339816 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:45 crc kubenswrapper[4812]: E0131 04:27:45.340005 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:45 crc kubenswrapper[4812]: E0131 04:27:45.340173 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.426341 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.426413 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.426426 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.426445 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.426460 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.528941 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.528992 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.529007 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.529028 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.529043 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.631567 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.631603 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.631615 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.631631 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.631643 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.734249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.734305 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.734322 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.734346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.734362 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.837336 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.837390 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.837408 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.837431 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.837481 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.939351 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.939404 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.939416 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.939434 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:45 crc kubenswrapper[4812]: I0131 04:27:45.939447 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:45Z","lastTransitionTime":"2026-01-31T04:27:45Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.041806 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.041870 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.041883 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.041898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.041907 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.145046 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.145155 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.145220 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.145249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.145268 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.248184 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.248245 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.248262 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.248287 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.248310 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.322506 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-14 02:55:25.099508701 +0000 UTC Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.338933 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:46 crc kubenswrapper[4812]: E0131 04:27:46.339123 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.355891 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.355958 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.355977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.356432 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.356484 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.460042 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.460111 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.460131 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.460156 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.460174 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.563448 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.563519 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.563539 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.563620 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.563666 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.666722 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.666769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.666780 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.666799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.666812 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.770502 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.770551 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.770566 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.770587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.770602 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.872795 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.872882 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.872901 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.872920 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.872933 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.988402 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.988444 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.988456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.988474 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:46 crc kubenswrapper[4812]: I0131 04:27:46.988486 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:46Z","lastTransitionTime":"2026-01-31T04:27:46Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.091103 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.091134 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.091145 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.091159 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.091172 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.193829 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.193961 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.193982 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.194005 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.194022 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.296129 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.296190 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.296209 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.296232 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.296248 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.323719 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-16 19:58:02.394754537 +0000 UTC Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.339149 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:47 crc kubenswrapper[4812]: E0131 04:27:47.339305 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.340057 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:47 crc kubenswrapper[4812]: E0131 04:27:47.340215 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.340451 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:47 crc kubenswrapper[4812]: E0131 04:27:47.340528 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.341072 4812 scope.go:117] "RemoveContainer" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.398726 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.398798 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.398825 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.398889 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.398912 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.502170 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.502215 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.502227 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.502248 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.502265 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.604098 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.604151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.604162 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.604182 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.604194 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.705796 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.705856 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.705867 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.705883 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.705909 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.783588 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/2.log" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.786665 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.787167 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.798931 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"qua
y.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed 
to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.807587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.807622 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.807634 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.807650 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.807661 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.815901 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-d
ir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945
c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}
},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.830441 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.848790 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.864646 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.885989 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df4
03bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.901077 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.909513 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.909535 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.909543 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.909556 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.909566 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:47Z","lastTransitionTime":"2026-01-31T04:27:47Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.915455 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.924582 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.935424 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.945248 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.954054 4812 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc 
kubenswrapper[4812]: I0131 04:27:47.971701 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.982403 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:47 crc kubenswrapper[4812]: I0131 04:27:47.992452 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:47Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.011171 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.011195 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.011204 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc 
kubenswrapper[4812]: I0131 04:27:48.011216 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.011225 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.021658 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-rel
ease-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true
,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/
\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.039430 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.052824 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.086640 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 
0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\"
:\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\
":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.114207 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.114259 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.114291 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.114313 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.114328 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.216372 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.216415 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.216429 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.216444 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.216456 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.318612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.318655 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.318664 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.318678 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.318686 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.324798 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-21 23:36:16.971037544 +0000 UTC Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.339215 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:48 crc kubenswrapper[4812]: E0131 04:27:48.339430 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.425802 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.425894 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.425917 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.425947 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.425969 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.528756 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.528805 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.528823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.528881 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.528900 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.631393 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.631601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.631641 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.631672 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.631692 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.734645 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.734714 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.734737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.734766 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.734795 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.794538 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/3.log" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.795661 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/2.log" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.800768 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" exitCode=1 Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.800836 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.800921 4812 scope.go:117] "RemoveContainer" containerID="c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.801916 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:27:48 crc kubenswrapper[4812]: E0131 04:27:48.802169 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.818978 4812 status_manager.go:875] "Failed to update 
status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imag
eID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.831587 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.836920 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.836944 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.836952 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.836964 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.836973 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.841827 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"}
,{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.860203 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.878319 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc 
kubenswrapper[4812]: I0131 04:27:48.899622 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.920660 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-0
1-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854
fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.940582 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.940641 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.940663 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:48 crc 
kubenswrapper[4812]: I0131 04:27:48.940688 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.940708 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:48Z","lastTransitionTime":"2026-01-31T04:27:48Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.943092 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.960689 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.978950 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:48 crc kubenswrapper[4812]: I0131 04:27:48.995256 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:48Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.030425 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d20994829
19d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cd
d47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://c7065e8087f7a80aff9d9b10a13a7ea479a21a8f1ca725e053c400c9bc142135\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:20Z\\\",\\\"message\\\":\\\"1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.708725 6448 
handler.go:208] Removed *v1.Pod event handler 6\\\\nI0131 04:27:20.708732 6448 handler.go:208] Removed *v1.EgressFirewall event handler 9\\\\nI0131 04:27:20.708739 6448 handler.go:208] Removed *v1.Pod event handler 3\\\\nI0131 04:27:20.709593 6448 reflector.go:311] Stopping reflector *v1alpha1.AdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.709765 6448 reflector.go:311] Stopping reflector *v1alpha1.BaselineAdminNetworkPolicy (0s) from sigs.k8s.io/network-policy-api/pkg/client/informers/externalversions/factory.go:141\\\\nI0131 04:27:20.710233 6448 reflector.go:311] Stopping reflector *v1.EgressQoS (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/egressqos/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710569 6448 reflector.go:311] Stopping reflector *v1.AdminPolicyBasedExternalRoute (0s) from github.com/openshift/ovn-kubernetes/go-controller/pkg/crd/adminpolicybasedroute/v1/apis/informers/externalversions/factory.go:140\\\\nI0131 04:27:20.710920 6448 handler.go:190] Sending *v1.EgressIP event handler 8 for removal\\\\nI0131 04:27:20.710977 6448 factory.go:656] Stopping watch factory\\\\nI0131 04:27:20.710999 6448 ovnkube.go:599] Stopped ovnkube\\\\nI0131 0\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:19Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:48Z\\\",\\\"message\\\":\\\"org/owner\\\\\\\":\\\\\\\"openshift-network-diagnostics/network-check-target\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, 
Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 04:27:48.369606 6837 ovnkube.go:599] Stopped ovnkube\\\\nI0131 04:27:48.369641 6837 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 04:27:48.369732 6837 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: 
certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:47Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\
\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f0551461512
5f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.044012 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.044241 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.044393 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.044538 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.044706 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: 
NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.069664 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\
",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-
v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art
-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59d
a973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.093096 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c9871
17ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:3
5Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startT
ime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.115460 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.135466 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.147714 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.147954 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.147988 4812 
kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.148012 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.148041 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.152510 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed
12d3a92828de57500f0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. 
pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\
\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.182340 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df403bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"
imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\
"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\
"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":tru
e,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"
exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.201986 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.250390 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.250702 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.251057 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.251283 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.251467 4812 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.325271 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-08 10:28:58.790258008 +0000 UTC Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.338771 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.338879 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.339236 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.339023 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.338896 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.339382 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.354523 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.354566 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.354578 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.354595 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.354607 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.457711 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.457769 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.457785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.457807 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.457823 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.492328 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.492397 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.492415 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.492437 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.492455 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.533348 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.540632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.540685 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.540696 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.540717 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.540730 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.562559 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.569746 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.569788 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.569801 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.569820 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.569831 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.586987 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.591785 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.591921 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.591950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.591977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.591995 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.615349 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.619042 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.619078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.619088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.619102 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.619111 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.633931 4812 kubelet_node_status.go:585] "Error updating node status, will retry" err="failed to patch status \"{\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"type\\\":\\\"DiskPressure\\\"},{\\\"type\\\":\\\"PIDPressure\\\"},{\\\"type\\\":\\\"Ready\\\"}],\\\"allocatable\\\":{\\\"cpu\\\":\\\"11800m\\\",\\\"ephemeral-storage\\\":\\\"76396645454\\\",\\\"memory\\\":\\\"32404556Ki\\\"},\\\"capacity\\\":{\\\"cpu\\\":\\\"12\\\",\\\"ephemeral-storage\\\":\\\"83293888Ki\\\",\\\"memory\\\":\\\"32865356Ki\\\"},\\\"conditions\\\":[{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient memory available\\\",\\\"reason\\\":\\\"KubeletHasSufficientMemory\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"MemoryPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has no disk pressure\\\",\\\"reason\\\":\\\"KubeletHasNoDiskPressure\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"DiskPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"kubelet has sufficient PID available\\\",\\\"reason\\\":\\\"KubeletHasSufficientPID\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PIDPressure\\\"},{\\\"lastHeartbeatTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:49Z\\\",\\\"message\\\":\\\"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?\\\",\\\"reason\\\":\\\"KubeletNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"}],\\\"images\\\":[{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b9ea248f8ca33258fe1683da51d2b16b94630be1b361c65f68a16c1a34b94887\\\"],\\\"sizeBytes\\\":2887430265},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:4a62fa1c0091f6d94e8fb7258470b9a532d78364b6b51a05341592041d598562\\\",\\\"registry.redhat.io/redhat/redhat-operator-index@sha256:8db792bab418e30d9b71b9e1ac330ad036025257abbd2cd32f318ed14f70d6ac\\\",\\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1523204510},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\"],\\\"sizeBytes\\\":1498102846},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\"],\\\"sizeBytes\\\":1232839934},{\\\"names\\\":[\\\"registry.redhat.io/redhat/community-operator-index@sha256:8ff55cdb2367f5011074d2f5ebdc153b8885e7495e14ae00f99d2b7ab3584ade\\\",\\\"registry.redhat.io/redhat/community-operator-index@sha256:d656c1453f2261d9b800f5c69fba3bc2ffdb388414c4c0e89fcbaa067d7614c4\\\",\\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1151049424},{\\\"names\\\":[\\\"registry.redhat.io/redhat/certified-operator-index@sha256:1d7d4739b2001bd173f2632d5f73724a5034237ee2d93a02a21bbfff547002ba\\\",\\\"registry.redhat.io/redhat/certified-operator-index@sha256:7688bce5eb0d153adff87fc9f7a47642465c0b88208efb236880197969931b37\\\",\\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"],\\\"sizeBytes\\\":1032059094},{\\\"names\\\":[\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:0878ac12c537fcfc617a539b3b8bd329ba568bb49c6e3bb47827b177c47ae669\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index@sha256:1dc1
5c170ebf462dacaef75511740ed94ca1da210f3980f66d77f91ba201c875\\\",\\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"],\\\"sizeBytes\\\":1001152198},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\"],\\\"sizeBytes\\\":964552795},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\"],\\\"sizeBytes\\\":947616130},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c3cc3840d7a81ce1b420f06e07a923861faf37d9c10688aa3aa0b7b76c8706ad\\\"],\\\"sizeBytes\\\":907837715},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:101f295e2eae0755ae1865f7de885db1f17b9368e4120a713bb5f79e17ce8f93\\\"],\\\"sizeBytes\\\":854694423},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:47b0670fa1051335fd2d2c9e8361e4ed77c7760c33a2180b136f7c7f59863ec2\\\"],\\\"sizeBytes\\\":852490370},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:862f4a4bed52f372056b6d368e2498ebfb063075b31cf48dbdaaeedfcf0396cb\\\"],\\\"sizeBytes\\\":772592048},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\"],\\\"sizeBytes\\\":705793115},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\"],\\\"sizeBytes\\\":687915987},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f247257b0885cf5d303e3612c7714b33ae51404cfa2429822060c6c025eb17dd\\\"],\\\"sizeBytes\\\":668060419},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\"],\\\"sizeBytes\\\":613826183},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e3e9dc0b02b9351edf7c46b1d46d724abd
1ac38ecbd6bc541cee84a209258d8\\\"],\\\"sizeBytes\\\":581863411},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\"],\\\"sizeBytes\\\":574606365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ee8d8f089ec1488067444c7e276c4e47cc93840280f3b3295484d67af2232002\\\"],\\\"sizeBytes\\\":550676059},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:10f20a39f16ae3019c62261eda8beb9e4d8c36cbb7b500b3bae1312987f0685d\\\"],\\\"sizeBytes\\\":541458174},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\"],\\\"sizeBytes\\\":533092226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\"],\\\"sizeBytes\\\":528023732},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\"],\\\"sizeBytes\\\":510867594},{\\\"names\\\":[\\\"quay.io/crcont/ocp-release@sha256:0b6ae0d091d2bf49f9b3a3aff54aabdc49e70c783780f118789f49d8f95a9e03\\\"],\\\"sizeBytes\\\":510526836},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\"],\\\"sizeBytes\\\":507459597},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7e9e7dd2b1a8394b7490ca6df8a3ee8cdfc6193ecc6fb6173ed9a1868116a207\\\"],\\\"sizeBytes\\\":505721947},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:094bb6a6641b4edbaf932f0551bcda20b0d4e012cbe84207348b24eeabd351e9\\\"],\\\"sizeBytes\\\":504778226},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c69fe7a98a744b7a7b61b2a8db81a338f373cd2b1d46c6d3f02864b30c37e46c\\\"],\\\"sizeBytes\\\":504735878},{\\\"names\\\":[\\\"quay.io/openshift-r
elease-dev/ocp-v4.0-art-dev@sha256:e51e6f78ec20ef91c82e94a49f950e427e77894e582dcc406eec4df807ddd76e\\\"],\\\"sizeBytes\\\":502943148},{\\\"names\\\":[\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\"],\\\"sizeBytes\\\":501379880},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:3a741253807c962189819d879b8fef94a9452fb3f5f3969ec3207eb2d9862205\\\"],\\\"sizeBytes\\\":500472212},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\"],\\\"sizeBytes\\\":498888951},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5aa9e5379bfeb63f4e517fb45168eb6820138041641bbdfc6f4db6427032fa37\\\"],\\\"sizeBytes\\\":497832828},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\"],\\\"sizeBytes\\\":497742284},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:88b1f0a05a1b1c91e1212b40f0e7d04c9351ec9d34c52097bfdc5897b46f2f0e\\\"],\\\"sizeBytes\\\":497120598},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:737e9019a072c74321e0a909ca95481f5c545044dd4f151a34d0e1c8b9cf273f\\\"],\\\"sizeBytes\\\":488494681},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:fe009d03910e18795e3bd60a3fd84938311d464d2730a2af5ded5b24e4d05a6b\\\"],\\\"sizeBytes\\\":487097366},{\\\"names\\\":[\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:66760a53b64d381940757ca9f0d05f523a61f943f8da03ce9791e5d05264a736\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner@sha256:e97a0cb5b6119a9735efe0ac24630a8912fcad89a1dddfa76dc10edac4ec9815\\\",\\\"registry.redhat.io/openshift4/ose-csi-external-provisioner:latest\\\"],\\\"sizeBytes\\\":485998616},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-ar
t-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\"],\\\"sizeBytes\\\":485767738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:898cae57123c5006d397b24af21b0f24a0c42c9b0be5ee8251e1824711f65820\\\"],\\\"sizeBytes\\\":485535312},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:1eda5ad6a6c5b9cd94b4b456e9116f4a0517241b614de1a99df14baee20c3e6a\\\"],\\\"sizeBytes\\\":479585218},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:487c0a8d5200bcdce484ab1169229d8fcb8e91a934be45afff7819c4f7612f57\\\"],\\\"sizeBytes\\\":476681373},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:b641ed0d63034b23d07eb0b2cd455390e83b186e77375e2d3f37633c1ddb0495\\\"],\\\"sizeBytes\\\":473958144},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:32f9e10dfb8a7c812ea8b3e71a42bed9cef05305be18cc368b666df4643ba717\\\"],\\\"sizeBytes\\\":463179365},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:8fdf28927b06a42ea8af3985d558c84d9efd142bb32d3892c4fa9f5e0d98133c\\\"],\\\"sizeBytes\\\":460774792},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dd0628f89ad843d82d5abfdc543ffab6a861a23cc3005909bd88fa7383b71113\\\"],\\\"sizeBytes\\\":459737917},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\"],\\\"sizeBytes\\\":457588564},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:adabc3456bf4f799f893d792cdf9e8cbc735b070be346552bcc99f741b0a83aa\\\"],\\\"sizeBytes\\\":450637738},{\\\"names\\\":[\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:342dca43b5b09123737ccda5e41b4a5d564e54333d8ce04d867d3fb968600317\\\"],\\\"sizeBytes\\\":448887027}],\\\"nodeInfo\\\":{\\\"bootID\\\":\\\"3069a142-20b2-4287-9a2d-d92558a419a1\\\",\\\"systemUUID\\\":\\\"9730f4f2-835d-4e9b-a74d-461488f96726\\\"}}}\" for node 
\"crc\": Internal error occurred: failed calling webhook \"node.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/node?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.634035 4812 kubelet_node_status.go:572] "Unable to update node status" err="update node status exceeds retry count" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.635089 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.635143 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.635160 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.635181 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.635198 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.738134 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.738171 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.738182 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.738198 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.738207 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.805964 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/3.log" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.810096 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:27:49 crc kubenswrapper[4812]: E0131 04:27:49.810227 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.828870 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: 
x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.841476 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.841548 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.841568 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.841592 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.841611 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.847387 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\
"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.861173 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d68793afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podI
Ps\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.876200 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793
426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to 
verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.892040 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":
\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.905797 4812 status_manager.go:875] 
"Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc 
kubenswrapper[4812]: I0131 04:27:49.921029 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"
quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6d
e2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.937109 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.944528 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.944571 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.944590 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.944612 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.944629 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:49Z","lastTransitionTime":"2026-01-31T04:27:49Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns 
error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.953430 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:49 crc kubenswrapper[4812]: I0131 04:27:49.986414 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-a
rt-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\
\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\
\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":
\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:49Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.004500 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\
\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.017712 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"host\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phas
e\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.044625 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:48Z\\\",\\\"message\\\":\\\"org/owner\\\\\\\":\\\\\\\"openshift-network-diagnostics/network-check-target\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 04:27:48.369606 6837 ovnkube.go:599] Stopped ovnkube\\\\nI0131 04:27:48.369641 6837 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 04:27:48.369732 6837 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.047513 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.047558 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.047569 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.047584 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.047593 4812 setters.go:603] "Node became not ready" node="crc" 
condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.059546 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true
,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.077851 4812 status_manager.go:875] "Failed to update status for pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-clu
ster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\"
:true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.
168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.092746 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.107234 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.125560 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.147592 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df4
03bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:50Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.149828 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.149919 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.149937 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.149960 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: 
I0131 04:27:50.149978 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.252522 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.252581 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.252597 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.252620 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.252637 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.326582 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-29 11:14:46.4642468 +0000 UTC Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.339235 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:50 crc kubenswrapper[4812]: E0131 04:27:50.339568 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.355147 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.355205 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.355224 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.355250 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.355266 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network 
plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.458233 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.458985 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.459029 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.459058 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.459077 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.561823 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.561939 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.561990 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.562014 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.562030 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.665029 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.665082 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.665100 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.665123 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.665140 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.768131 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.768213 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.768236 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.768265 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.768289 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.870401 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.870472 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.870496 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.870527 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.870553 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.973914 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.973969 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.973992 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.974020 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:50 crc kubenswrapper[4812]: I0131 04:27:50.974042 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:50Z","lastTransitionTime":"2026-01-31T04:27:50Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.077213 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.077275 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.077292 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.077316 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.077334 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.187737 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.187778 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.187792 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.187808 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.187822 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.290373 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.290447 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.290470 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.290501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.290523 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.327207 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-06 06:18:03.67038467 +0000 UTC Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.338646 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:51 crc kubenswrapper[4812]: E0131 04:27:51.338789 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.338909 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:51 crc kubenswrapper[4812]: E0131 04:27:51.338991 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.339101 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:51 crc kubenswrapper[4812]: E0131 04:27:51.339285 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.393469 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.393537 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.393562 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.393591 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.393613 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.496947 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.497018 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.497035 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.497060 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.497076 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.599915 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.599975 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.599994 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.600022 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.600044 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.703311 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.703369 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.703388 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.703411 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.703429 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.805512 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.805575 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.805599 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.805629 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.805651 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.908727 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.908780 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.908796 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.908871 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:51 crc kubenswrapper[4812]: I0131 04:27:51.908891 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:51Z","lastTransitionTime":"2026-01-31T04:27:51Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.012419 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.012534 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.012561 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.012587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.012607 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.115758 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.115820 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.115871 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.115902 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.115926 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.218632 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.218710 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.218816 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.218875 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.218895 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.321354 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.321422 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.321454 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.321490 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.321513 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.327744 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-11 21:18:54.699720199 +0000 UTC Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.339183 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:52 crc kubenswrapper[4812]: E0131 04:27:52.339337 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.373570 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-etcd/etcd-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"7a85631b-1538-4ee9-a5b1-58fd701159c6\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:36Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:54Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://b56d703e6d45f15f5a25f0da7210b69d9e8a37a02e13796621a5aef8e8c17b76\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"moun
tPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://31a0b52ee823d155bc7d9e1fb7b217040fadcf102657bb86ff8d02234509141d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://90df94a27832e9a4e21fc268db4f7464f45bedfbee9c8d47d5699fd399b4d443\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd/\\\",\\\"name\\\":\\\"log-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://e35c86c87a4e442ad020c00806c4539d8f8786261d4d363f6234061abff7528f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b2670
2f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:a0fa3723269019bee1847b26702f42928e779036cc2f58408f8ee7866be30a93\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-rev\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:37Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/lib/etcd\\\",\\\"name\\\":\\\"data-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://81a6d3678cda67971aec780450ba1e2f83e44e5ff8b1e059fea3d3da184d413a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcdctl\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:36Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/manifests\\\",\\\"name\\\":\\\"static-pod-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/lib/etcd/\\\",\\\"name\\\":\\\"data-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b
90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a5d3fb1066ff9be9e1bccd31673877d3c465f63aca6f0fa24ac35e52279485cc\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/etcd\\\",\\\"name\\\":\\\"log-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-ensure-env-vars\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f48aa5c092a0de124cc2fce3b5bc263be35c28145d7142bc209b11c21aaa1bfa\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}}},{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:07b7c6877441ecd6a5646fb68e33e9be8b90092272e49117b54b4a67314731ca\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"etcd-resources-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://2f4815082a4bf6d3902639ff5a30d95d234b4e7420f9ff59da973b40e9b47ff8\\\",\\\"exitCode\\\":0,\\\"fin
ishedAt\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/usr/local/bin\\\",\\\"name\\\":\\\"usr-local-bin\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-etcd\"/\"etcd-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.391806 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/iptables-alerter-4ln5h" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"d75a4c96-2883-4a0b-bab2-0fab2b6c0b49\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d4037168fca327551a5293f6b4a9c70abe6b4faa340faeae9083a748e0faa53a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"iptables-alerter\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:55Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/iptables-alerter\\\",\\\"name\\\":\\\"iptables-alerter-script\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rczfb\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"iptables-alerter-4ln5h\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has 
expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.408339 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-image-registry/node-ca-kctmd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"e2b2af11-2df5-49c5-92e2-3965de954bb2\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://d3416610cb64990b67041193cd872aea03a09b621ea69fef7cfafe0b56aaad41\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:9fa29d188c85a8b1e1bd15c9c18e96f1b235da9bd4a45dbc086a4a69520ed63f\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"node-ca\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/tmp/serviceca\\\",\\\"name\\\":\\\"serviceca\\\"},{\\\"mountPath\\\":\\\"/etc/docker/certs.d\\\",\\\"name\\\":\\\"hos
t\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fx7l4\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-image-registry\"/\"node-ca-kctmd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.424693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.424743 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.424760 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.424782 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.424798 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.441068 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: [ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"message\\\":\\\"containers with unready status: 
[ovnkube-controller]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-node\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-ovn-metrics\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-node-metrics-cert\\
\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"nbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"northd\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{
\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-acl-logging\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovn-controller\\\"
,\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn/\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/dev/log\\\",\\\"name\\\":\\\"log-socket\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:48Z\\\",\\\"message\\\":\\\"org/owner\\\\\\\":\\\\\\\"openshift-network-diagnostics/network-check-target\\\\\\\"}, Opts:services.LBOpts{Reject:true, EmptyLBEvents:false, AffinityTimeOut:0, SkipSNAT:false, Template:false, AddressFamily:\\\\\\\"\\\\\\\"}, Rules:[]services.LBRule{services.LBRule{Source:services.Addr{IP:\\\\\\\"10.217.5.219\\\\\\\", Port:80, 
Template:(*services.Template)(nil)}, Targets:[]services.Addr{}}}, Templates:services.TemplateMap(nil), Switches:[]string{}, Routers:[]string{}, Groups:[]string{\\\\\\\"clusterLBGroup\\\\\\\"}}}\\\\nI0131 04:27:48.369606 6837 ovnkube.go:599] Stopped ovnkube\\\\nI0131 04:27:48.369641 6837 metrics.go:553] Stopping metrics server at address \\\\\\\"127.0.0.1:29103\\\\\\\"\\\\nF0131 04:27:48.369732 6837 ovnkube.go:137] failed to run ovnkube: [failed to start network controller: failed to start default network controller: unable to create admin network policy controller, err: could not add Event Handler for anpInformer during admin network policy controller initialization, handler {0x1fcc6e0 0x1fcc3c0 0x1fcc360} was not added to shared informer because it has stopped already, failed to start node network controller: failed to start default node network controller: failed to set node crc annotations: Internal error occurred: failed calling webhook \\\\\\\"node.network-node-identity.openshift.io\\\\\\\": failed to call webhook: Post \\\\\\\"https://127.0.0.1:9743/node?timeout=10s\\\\\\\": tls: failed to verify certificate: x509: certi\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:47Z\\\"}},\\\"name\\\":\\\"ovnkube-controller\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"message\\\":\\\"back-off 40s restarting failed container=ovnkube-controller 
pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\\\",\\\"reason\\\":\\\"CrashLoopBackOff\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-kubelet\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/systemd/system\\\",\\\"name\\\":\\\"systemd-units\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host\\\",\\\"name\\\":\\\"host-slash\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/ovn-kubernetes/\\\",\\\"name\\\":\\\"host-run-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/run/systemd/private\\\",\\\"name\\\":\\\"run-systemd\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/cni-bin-dir\\\",\\\"name\\\":\\\"host-cni-bin\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d\\\",\\\"name\\\":\\\"host-cni-netd\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/networks/ovn-k8s-cni-overlay\\\",\\\"name\\\":\\\"host-var-lib-cni-networks-ovn-kubernetes\\\"},{\\\"mountPath\\\":\\\"/run/openvswitch\\\",\\\"name\\\":\\\"run-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/log/ovnkube/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/etc/openvswitch\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/lib/openvswitch\\\",\\\"name\\\":\\\"var-lib-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kube
rnetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"sbdb\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/ovnkube-lib\\\",\\\"name\\\":\\\"ovnkube-script-lib\\\"},{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/run/ovn/\\\",\\\"name\\\":\\\"run-ovn\\\"},{\\\"mountPath\\\":\\\"/var/log/ovn\\\",\\\"name\\\":\\\"node-log\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kubecfg-setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://66f30b69fe852c98c7
d0a1eac34421997953b58e3d711c1d2c2f05514615125f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/ovn/\\\",\\\"name\\\":\\\"etc-openvswitch\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cvm2v\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:56Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-node-bl2f9\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.462951 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-pnwcx" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"6050f642-2492-4f83-a739-ac905c409b8c\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:44Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049\\\",\\\"exitCode\\\":1,\\\"finishedAt\\\":\\\"2026-01-31T04:27:43Z\\\",\\\"message\\\":\\\"2026-01-31T04:26:57+00:00 [cnibincopy] Successfully copied files in /usr/src/multus-cni/rhel9/bin/ to /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18\\\\n2026-01-31T04:26:57+00:00 [cnibincopy] Successfully moved files in /host/opt/cni/bin/upgrade_399cd47e-be84-4536-a093-a4b847f7ca18 to /host/opt/cni/bin/\\\\n2026-01-31T04:26:57Z [verbose] multus-daemon started\\\\n2026-01-31T04:26:57Z [verbose] 
Readiness Indicator file check\\\\n2026-01-31T04:27:42Z [error] have you checked that your default network is ready? still waiting for readinessindicatorfile @ /host/run/multus/cni/net.d/10-ovn-kubernetes.conf. pollimmediate error: timed out waiting for the condition\\\\n\\\",\\\"reason\\\":\\\"Error\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"name\\\":\\\"kube-multus\\\",\\\"ready\\\":true,\\\"restartCount\\\":1,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:43Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/run/multus/cni/net.d\\\",\\\"name\\\":\\\"multus-cni-dir\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/run/multus\\\",\\\"name\\\":\\\"multus-socket-dir-parent\\\"},{\\\"mountPath\\\":\\\"/run/k8s.cni.cncf.io\\\",\\\"name\\\":\\\"host-run-k8s-cni-cncf-io\\\"},{\\\"mountPath\\\":\\\"/run/netns\\\",\\\"name\\\":\\\"host-run-netns\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/bin\\\",\\\"name\\\":\\\"host-var-lib-cni-bin\\\"},{\\\"mountPath\\\":\\\"/var/lib/cni/multus\\\",\\\"name\\\":\\\"host-var-lib-cni-multus\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"host-var-lib-kubelet\\\"},{\\\"mountPath\\\":\\\"/hostroot\\\",\\\"name\\\":\\\"hostroot\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/net.d\\\",\\\"name\\\":\\\"multus-conf-dir\\\"},{\\\"mountPath\\\":\\\"/etc/cni/net.d/multus.d\\\",\\\"name\\\":\\\"multus-daemon-config\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/cni/multus/certs\\\",\\\"name\\\":\\\"host-run-multus-certs\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kubernetes\\\"},{\\\"mountPath\\\":\\\"/
var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-96b4n\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-pnwcx\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.491266 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"258de1b0-7f55-45cb-9ce9-57366ae91c94\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:02Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:03Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://df4
03bfb5cdf1dfd17d5e32086c100e6b9f3d990dc078aa07d50c49d9110cbe5\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:7eeaee65f2808b819eedb413bdcabb9144e12f0dd97f13fd1afba93a95b67b26\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-multus-additional-cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:02Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e40792096b162f0f9ce5f8362f51e5f8dea2c1ce4b1447235388416b5db7708c\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"egress-router-binary-copy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ea028cd09cb5db754876654ded54142c6505b9259fd758464b3bb1e1f91c5c82\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly
\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:687fddfbb085a1688df312ce4ec8c857df9b2daed8ff4a7ed6163a1154afa2cc\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cni-plugins\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://9fc98d2129a4aa2dab69f5fb8100bfa00ff5fd12fe46ec5d12ed4870becdb62b\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:57Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/tuning/\\\",\\\"name\\\":\\\"tuning-conf-dir\\\"},{\\\"mountPath\\\":\\\"/sysctls\\\",\\\"name\\\":\\\"cni-sysctl-allowlist\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203bb2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:383f4cceeeaead203b
b2327fdd367c64b64d729d7fa93089f249e496fcef0c78\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"bond-cni-plugin\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://7d74994a3a5d88e49d752b1646e32b282621732bfc7d45362c3ff540aa7dee28\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:58Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:58Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:f567acb85146b5ed81451ec3e79f2de0c62e28c69b2eeade0abdf5d0c388e7aa\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"routeoverride-cni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://a49ad7313b074fa54d746d452566f751fcd78dc2dfca6ee6417ce98428ece0af\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:59Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:59Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-rel
ease\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-cni-bincopy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://8e7024abecc39e32457478357884cd94dc92d31481104fd4f595c34fca84ea35\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:00Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/entrypoint\\\",\\\"name\\\":\\\"cni-binary-copy\\\"},{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/os-release\\\",\\\"name\\\":\\\"os-release\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:98100674616e54319f6713d742fd0c3bdbc84e6e6173e8ccf4a2473a714c2bc4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"whereabouts-c
ni\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://f29454fd0b01350df8bafaeaae0dc41d83e41a866932685d9da9732f15e03152\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:27:01Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:27:01Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/host/opt/cni/bin\\\",\\\"name\\\":\\\"cnibin\\\"},{\\\"mountPath\\\":\\\"/host/etc/cni/net.d\\\",\\\"name\\\":\\\"system-cni-dir\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-5vp29\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-multus\"/\"multus-additional-cni-plugins-2vzj6\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.509191 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"f58d8fb1-d496-4d7e-9c14-a91b656355be\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://bd4ee5838c803a816c4a6f068b8c979d83eb6bb8d249751877af4d9b967ec35d\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy-crio\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"etc-kube\\\"},{\\\"mountPath\\\":\\\"/var/lib/kubelet\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962
a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://ab15730cf1a9889a2af529e5cf21234c96112d64a8c2a8c54c7578e9070b724f\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var\\\",\\\"name\\\":\\\"var-lib-kubelet\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"kube-rbac-proxy-crio-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529626 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529671 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529688 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529712 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529730 4812 setters.go:603] "Node 
became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.529794 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"6cb4c886-f070-4393-9d9f-9bf9878fcac2\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:51Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\
\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"},{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-regeneration-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"
containerID\\\":\\\"cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-check-endpoints\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-apiserver-insecure-readyz\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}}}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":
{},\\\"name\\\":\\\"setup\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"containerID\\\":\\\"cri-o://412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/log/kube-apiserver\\\",\\\"name\\\":\\\"audit-dir\\\"}]}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-apiserver\"/\"kube-apiserver-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.548896 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"9d751cbb-f2e2-430d-9754-c882a5e924a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[check-endpoints]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"check-endpoints\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2dwl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-source-55646444c4-trplf\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.563163 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: [networking-console-plugin]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:ae647598ec35cda5766806d3d44a91e3b9d4dee48ff154f3d8490165399873fd\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. 
The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"networking-console-plugin\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/cert\\\",\\\"name\\\":\\\"networking-console-plugin-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/etc/nginx/nginx.conf\\\",\\\"name\\\":\\\"nginx-conf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-console\"/\"networking-console-plugin-85b44fc459-gdk6g\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.577434 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"9c158521-712e-4c94-8acf-5244e32666a5\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:09Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://9b5f099148701486d974afe4fdb3a4c49321ef5d38d43dba7374f6fd4075a396\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/pki/tls/metrics-cert\\\",\\\"name\\\":\\\"ovn-control-plane-metrics-cert\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://40acddc2ab4edcf5008b66f7adba2224aedda
5e58af199ee9f52864a28bd4fec\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"ovnkube-cluster-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:27:10Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/run/ovnkube-config/\\\",\\\"name\\\":\\\"ovnkube-config\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-lc2jk\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:09Z\\\"}}\" for pod \"openshift-ovn-kubernetes\"/\"ovnkube-control-plane-749d76644c-r9j25\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.593829 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-multus/network-metrics-daemon-wg68w" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"2c369253-313a-484c-bc8a-dae99abab086\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:10Z\\\",\\\"message\\\":\\\"containers with unready status: [network-metrics-daemon 
kube-rbac-proxy]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/metrics\\\",\\\"name\\\":\\\"metrics-certs\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:d98bb346a17feae024d92663df92b25c120938395ab7043afbed543c6db9ca8d\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-metrics-daemon\\\",\\\"ready\\\":false,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-zj6tv\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:27:10Z\\\"}}\" for pod \"openshift-multus\"/\"network-metrics-daemon-wg68w\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc 
kubenswrapper[4812]: I0131 04:27:52.613712 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-operator/network-operator-58b4c7f79c-55gtf" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"37a5e44f-9a88-4405-be8a-b645485e7312\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://cdaf6529ba2755ae53577f1b84368326e61e5e0e1e67773738eca6e9978ab5de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"network-operator\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes\\\",\\\"name\\\":\\\"host-etc-kube\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/serving-cert\\\",\\\"name\\\":\\\"metrics-tls\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-rdwmf\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-operator\"/\"network-operator-58b4c7f79c-55gtf\": 
Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.630362 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-node-identity/network-node-identity-vrzqb" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"ef543e1b-8068-4ea3-b32a-61027b32e95d\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:52Z\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://2091ef446948c057b74973fd53fa20d05953c7bb6af9c2ca3fbb20b53f92ee2a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"approver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mount
Path\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://c02684fa4884cb6687a8358f36e728cc4438565318bff7df73afcfa9fef2f85a\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:174f36cdd47ef0d1d2099482919d773257453265a2af0b17b154edc32fa41ac2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"webhook\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:52Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/webhook-cert/\\\",\\\"name\\\":\\\"webhook-cert\\\"},{\\\"mountPath\\\":\\\"/env\\\",\\\"name\\\":\\\"env-overrides\\\"},{\\\"mountPath\\\":\\\"/var/run/ovnkube-identity-config\\\",\\\"name\\\":\\\"ovnkube-identity-cm\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-s2kz5\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}]}}\" for pod \"openshift-network-node-identity\"/\"network-node-identity-vrzqb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.634899 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.634988 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: 
I0131 04:27:52.635009 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.635065 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.635083 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.647365 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-dns/node-resolver-h7gqd" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3488c03b-583a-49f2-818a-0b2d55648e51\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:57Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://f91ba85d6879
3afdfabf7885852a685a40db1ff5d5fcbed2b3a11b81294aadf3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:35512335ac39aed0f55b7f799f416f4f6445c20c1b19888cf2bb72bb276703f2\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"dns-node-resolver\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/hosts\\\",\\\"name\\\":\\\"hosts-file\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-fgj4m\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-dns\"/\"node-resolver-h7gqd\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.667327 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"62392df6-29ca-4dfc-b3ab-db13388a43a6\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:55Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:56Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://7f713d9492adcf65e932e4e7d03d3ffbfb93690c35ac0834a7edc956143f31cb\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:242b3d66438c42745f4ef318bdeaf3d793426f12962a42ea83e18d06c08aaf09\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-rbac-proxy\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/tls/private\\\",\\\"name\\\":\\\"proxy-tls\\\"},{\\\"mountPath\\\":\\\"/etc/kube-rbac-proxy\\\",\\\"name\\\":\\\"mcd-auth-proxy-config\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]},{\\\"containerID\\\":\\\"cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42
d659bdf78629e30c26cc52de\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c915fb8ba96e911699a1ae34a8e95ca8a9fbe1bf8c28fea177225c63a8bdfc0a\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"machine-config-daemon\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:56Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/rootfs\\\",\\\"name\\\":\\\"rootfs\\\"},{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-2l7vl\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:55Z\\\"}}\" for pod \"openshift-machine-config-operator\"/\"machine-config-daemon-lx2wb\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.688532 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"a6604765-b55e-43a6-a5b9-2ec9e09581d1\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:34Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:27:27Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://4fbdbbf942863f10cfc4bc8e2f7c73464b39cb61e6c8f584f54e3d22145893f3\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ea3d8ab79c683fd4ab04e5368321e4d4a082226ec096c29932fc06b015027bda\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha
256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://9994b559c968737a2b49abade9e168e4699694e8df8912b69290decef9a09988\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:5b881c97aa8e440c6b3ca001edfd789a9380066b8f11f35a8dd8d88c5c7dbf86\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-scheduler-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:35Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"initContainerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"wait-for-host-port\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":false,\\\"state\\\":{\\\"terminated\\\":{\\\"conta
inerID\\\":\\\"cri-o://47037478cd3d93ed6a5be33c0443c40c0543a5d23b6f1c12f315402b7cc37e7e\\\",\\\"exitCode\\\":0,\\\"finishedAt\\\":\\\"2026-01-31T04:26:33Z\\\",\\\"reason\\\":\\\"Completed\\\",\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}}}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-scheduler\"/\"openshift-kube-scheduler-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.708177 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="failed to patch status 
\"{\\\"metadata\\\":{\\\"uid\\\":\\\"c21890f1-2803-4ee8-a48d-7f93a791c876\\\"},\\\"status\\\":{\\\"conditions\\\":[{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:35Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Initialized\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:53Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"ContainersReady\\\"},{\\\"lastProbeTime\\\":null,\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:32Z\\\",\\\"status\\\":\\\"True\\\",\\\"type\\\":\\\"PodScheduled\\\"}],\\\"containerStatuses\\\":[{\\\"containerID\\\":\\\"cri-o://23ac197e75c345badf70d181677195961d4c07666692d06478e3fcf40b87261c\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c0f9da410c07372b6c9ad6a79379b491cd10fdee88051c026b084652d85aed21\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"cluster-policy-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792\\\",\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"imageID\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-
art-dev@sha256:06bc35825771aee1220d34720243b89c4ba8a8b335e6de2597126bd791fd90d4\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:33Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://ba3e0e21c259792ec108dca3592e7f8c8705fd43580eeed69f929c6674967856\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-cert-syncer\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"startedAt\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]},{\\\"containerID\\\":\\\"cri-o://68f3574672e27b38478e3ece120f54459083d7590ee12f49bdd854fa6b0a3d04\\\",\\\"image\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"imageID\\\":\\\"quay.io/crcont/openshift-crc-cluster-kube-controller-manager-operator@sha256:8506ce0a578bc18fac117eb2b82799488ffac0bed08287faaf92edaf5d17ab95\\\",\\\"lastState\\\":{},\\\"name\\\":\\\"kube-controller-manager-recovery-controller\\\",\\\"ready\\\":true,\\\"restartCount\\\":0,\\\"started\\\":true,\\\"state\\\":{\\\"running\\\":{\\\"started
At\\\":\\\"2026-01-31T04:26:34Z\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-resources\\\",\\\"name\\\":\\\"resource-dir\\\"},{\\\"mountPath\\\":\\\"/etc/kubernetes/static-pod-certs\\\",\\\"name\\\":\\\"cert-dir\\\"}]}],\\\"hostIP\\\":\\\"192.168.126.11\\\",\\\"hostIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"phase\\\":\\\"Running\\\",\\\"podIP\\\":\\\"192.168.126.11\\\",\\\"podIPs\\\":[{\\\"ip\\\":\\\"192.168.126.11\\\"}],\\\"startTime\\\":\\\"2026-01-31T04:26:32Z\\\"}}\" for pod \"openshift-kube-controller-manager\"/\"kube-controller-manager-crc\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.726702 4812 status_manager.go:875] "Failed to update status for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" err="failed to patch status \"{\\\"metadata\\\":{\\\"uid\\\":\\\"3b6479f0-333b-4a96-9adf-2099afdc2447\\\"},\\\"status\\\":{\\\"$setElementOrder/conditions\\\":[{\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"type\\\":\\\"Initialized\\\"},{\\\"type\\\":\\\"Ready\\\"},{\\\"type\\\":\\\"ContainersReady\\\"},{\\\"type\\\":\\\"PodScheduled\\\"}],\\\"conditions\\\":[{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"PodReadyToStartContainers\\\"},{\\\"message\\\":\\\"containers with unready status: [network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"type\\\":\\\"Ready\\\"},{\\\"lastTransitionTime\\\":\\\"2026-01-31T04:26:49Z\\\",\\\"message\\\":\\\"containers with unready status: 
[network-check-target-container]\\\",\\\"reason\\\":\\\"ContainersNotReady\\\",\\\"status\\\":\\\"False\\\",\\\"type\\\":\\\"ContainersReady\\\"}],\\\"containerStatuses\\\":[{\\\"image\\\":\\\"quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:e1baa38811c04bd8909e01a1f3be7421a1cb99d608d3dc4cf86d95b17de2ab8b\\\",\\\"imageID\\\":\\\"\\\",\\\"lastState\\\":{\\\"terminated\\\":{\\\"exitCode\\\":137,\\\"finishedAt\\\":null,\\\"message\\\":\\\"The container could not be located when the pod was deleted. The container used to be Running\\\",\\\"reason\\\":\\\"ContainerStatusUnknown\\\",\\\"startedAt\\\":null}},\\\"name\\\":\\\"network-check-target-container\\\",\\\"ready\\\":false,\\\"restartCount\\\":3,\\\"started\\\":false,\\\"state\\\":{\\\"waiting\\\":{\\\"reason\\\":\\\"ContainerCreating\\\"}},\\\"volumeMounts\\\":[{\\\"mountPath\\\":\\\"/var/run/secrets/kubernetes.io/serviceaccount\\\",\\\"name\\\":\\\"kube-api-access-cqllr\\\",\\\"readOnly\\\":true,\\\"recursiveReadOnly\\\":\\\"Disabled\\\"}]}],\\\"podIP\\\":null,\\\"podIPs\\\":null}}\" for pod \"openshift-network-diagnostics\"/\"network-check-target-xd92c\": Internal error occurred: failed calling webhook \"pod.network-node-identity.openshift.io\": failed to call webhook: Post \"https://127.0.0.1:9743/pod?timeout=10s\": tls: failed to verify certificate: x509: certificate has expired or is not yet valid: current time 2026-01-31T04:27:52Z is after 2025-08-24T17:21:41Z" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.738833 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.738898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.738910 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc 
kubenswrapper[4812]: I0131 04:27:52.738926 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.738939 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.841638 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.841693 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.841711 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.841736 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.841755 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.944231 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.944290 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.944310 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.944333 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:52 crc kubenswrapper[4812]: I0131 04:27:52.944350 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:52Z","lastTransitionTime":"2026-01-31T04:27:52Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.046826 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.046950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.046970 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.046995 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.047012 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.149889 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.149950 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.150010 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.150035 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.150053 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.252824 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.252903 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.252920 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.252945 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.252962 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.328912 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-18 19:48:14.372947145 +0000 UTC Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.339266 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:53 crc kubenswrapper[4812]: E0131 04:27:53.339603 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.339291 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:53 crc kubenswrapper[4812]: E0131 04:27:53.340085 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.339281 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:53 crc kubenswrapper[4812]: E0131 04:27:53.340507 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.355545 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.355591 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.355607 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.355629 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.355646 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.477249 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.477311 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.477330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.477353 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.477370 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.580503 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.580870 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.581049 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.581235 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.581391 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.684053 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.684117 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.684141 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.684168 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.684189 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.787020 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.787087 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.787116 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.787142 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.787159 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.890466 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.890528 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.890546 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.890570 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.890587 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.993394 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.993431 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.993441 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.993456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:53 crc kubenswrapper[4812]: I0131 04:27:53.993468 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:53Z","lastTransitionTime":"2026-01-31T04:27:53Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.096154 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.096216 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.096232 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.096255 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.096271 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.199014 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.199055 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.199066 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.199081 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.199091 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.303346 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.303593 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.303619 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.303651 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.303687 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.329408 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-11-30 03:42:22.075743088 +0000 UTC Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.339439 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:54 crc kubenswrapper[4812]: E0131 04:27:54.339645 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.407048 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.407117 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.407140 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.407167 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.407185 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.510134 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.510166 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.510176 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.510190 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.510202 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.613064 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.613137 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.613200 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.613224 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.613241 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.716623 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.716670 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.716686 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.716707 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.716726 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.819833 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.819929 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.819945 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.819969 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.819985 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.922243 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.922318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.922342 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.922374 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:54 crc kubenswrapper[4812]: I0131 04:27:54.922396 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:54Z","lastTransitionTime":"2026-01-31T04:27:54Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.024527 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.024584 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.024597 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.024613 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.024624 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.127228 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.127264 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.127272 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.127285 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.127295 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.230161 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.230198 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.230207 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.230221 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.230231 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.330546 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-24 02:54:43.503027265 +0000 UTC Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.332268 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.332324 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.332347 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.332376 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.332400 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.338523 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.338596 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.338596 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.338778 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.338861 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.339149 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.435338 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.435411 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.435424 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.435439 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.435451 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.440271 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.440527 4812 configmap.go:193] Couldn't get configMap openshift-network-console/networking-console-plugin: object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.440644 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.440618223 +0000 UTC m=+147.935639918 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "nginx-conf" (UniqueName: "kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.537727 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.537778 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.537793 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.537814 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.537829 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.541515 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.541649 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541687 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.541664256 +0000 UTC m=+148.036685931 (durationBeforeRetry 1m4s). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.541721 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541781 4812 secret.go:188] Couldn't get secret openshift-network-console/networking-console-plugin-cert: object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.541796 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541871 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert podName:5fe485a1-e14f-4c09-b5b9-f252bc42b7e8 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.54182469 +0000 UTC m=+148.036846395 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert") pod "networking-console-plugin-85b44fc459-gdk6g" (UID: "5fe485a1-e14f-4c09-b5b9-f252bc42b7e8") : object "openshift-network-console"/"networking-console-plugin-cert" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541904 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541922 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541942 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541943 4812 projected.go:288] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541955 4812 projected.go:194] Error preparing data for projected volume kube-api-access-cqllr for pod openshift-network-diagnostics/network-check-target-xd92c: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541967 4812 projected.go:194] Error preparing data for projected volume kube-api-access-s2dwl for pod openshift-network-diagnostics/network-check-source-55646444c4-trplf: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object 
"openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.541990 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr podName:3b6479f0-333b-4a96-9adf-2099afdc2447 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.541978214 +0000 UTC m=+148.036999889 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "kube-api-access-cqllr" (UniqueName: "kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr") pod "network-check-target-xd92c" (UID: "3b6479f0-333b-4a96-9adf-2099afdc2447") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:55 crc kubenswrapper[4812]: E0131 04:27:55.542025 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl podName:9d751cbb-f2e2-430d-9754-c882a5e924a5 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.542006415 +0000 UTC m=+148.037028120 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-s2dwl" (UniqueName: "kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl") pod "network-check-source-55646444c4-trplf" (UID: "9d751cbb-f2e2-430d-9754-c882a5e924a5") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.640683 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.640749 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.640770 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.640792 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.640809 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.744330 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.744406 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.744430 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.744462 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.744485 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.847088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.847152 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.847169 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.847191 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.847207 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.949585 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.949618 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.949626 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.949657 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:55 crc kubenswrapper[4812]: I0131 04:27:55.949667 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:55Z","lastTransitionTime":"2026-01-31T04:27:55Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.053712 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.053821 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.053889 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.053913 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.053930 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.157603 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.157662 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.157676 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.157694 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.157708 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.260456 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.260514 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.260539 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.260567 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.260589 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.330692 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-09 19:00:44.404060504 +0000 UTC Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.339378 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:56 crc kubenswrapper[4812]: E0131 04:27:56.339592 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.363734 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.363774 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.363790 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.363812 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.363828 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.466984 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.467042 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.467065 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.467087 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.467103 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.570019 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.570078 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.570095 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.570120 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.570137 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.673594 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.673647 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.673667 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.673690 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.673706 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.777426 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.777481 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.777496 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.777518 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.777534 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.881132 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.881192 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.881208 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.881230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.881248 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.987926 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.988296 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.988308 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.988323 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:56 crc kubenswrapper[4812]: I0131 04:27:56.988333 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:56Z","lastTransitionTime":"2026-01-31T04:27:56Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.090718 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.090762 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.090773 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.090788 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.090798 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.194680 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.194755 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.194768 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.194786 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.194799 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.297423 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.297477 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.297493 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.297510 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.297522 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.330942 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2026-01-14 21:15:31.729485768 +0000 UTC Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.339285 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.339344 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.339394 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:57 crc kubenswrapper[4812]: E0131 04:27:57.339476 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:57 crc kubenswrapper[4812]: E0131 04:27:57.339618 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:57 crc kubenswrapper[4812]: E0131 04:27:57.339756 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.400190 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.400252 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.400265 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.400285 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.400299 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.503587 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.503636 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.503645 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.503660 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.503669 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.606962 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.607037 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.607058 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.607088 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.607110 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.709221 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.709269 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.709283 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.709300 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.709313 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.811942 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.812008 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.812228 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.812245 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.812258 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.915164 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.915225 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.915241 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.915264 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:57 crc kubenswrapper[4812]: I0131 04:27:57.915281 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:57Z","lastTransitionTime":"2026-01-31T04:27:57Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.018212 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.018252 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.018262 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.018278 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.018293 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.120100 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.120137 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.120151 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.120166 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.120180 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.222911 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.222977 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.223000 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.223028 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.223051 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.326501 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.326563 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.326580 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.326601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.326619 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.331639 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-01 16:05:54.1854664 +0000 UTC Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.339017 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:27:58 crc kubenswrapper[4812]: E0131 04:27:58.339175 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.429427 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.429464 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.429474 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.429490 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.429503 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.532901 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.533196 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.533348 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.533500 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.533635 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.636792 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.637067 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.637209 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.637340 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.637503 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.740139 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.740274 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.740294 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.740327 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.740346 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.842722 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.842799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.842827 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.842898 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.842922 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.945727 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.945781 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.945799 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.945824 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:58 crc kubenswrapper[4812]: I0131 04:27:58.945879 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:58Z","lastTransitionTime":"2026-01-31T04:27:58Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.049160 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.049193 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.049203 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.049219 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.049229 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.152119 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.152187 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.152210 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.152238 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.152260 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.255463 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.255521 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.255541 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.255570 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.255588 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.332029 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-26 11:59:01.317243927 +0000 UTC Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.339364 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.339433 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.339434 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:27:59 crc kubenswrapper[4812]: E0131 04:27:59.339540 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:27:59 crc kubenswrapper[4812]: E0131 04:27:59.339671 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:27:59 crc kubenswrapper[4812]: E0131 04:27:59.339723 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.359041 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.359087 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.359107 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.359131 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.359151 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.461719 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.461779 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.461801 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.461831 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.462242 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.564290 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.564331 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.564343 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.564360 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.564372 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.667084 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.667137 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.667149 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.667170 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.667183 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.770230 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.770291 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.770302 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.770318 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.770329 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.773536 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientMemory" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.773575 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasNoDiskPressure" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.773586 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeHasSufficientPID" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.773601 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeNotReady" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.773612 4812 setters.go:603] "Node became not ready" node="crc" condition={"type":"Ready","status":"False","lastHeartbeatTime":"2026-01-31T04:27:59Z","lastTransitionTime":"2026-01-31T04:27:59Z","reason":"KubeletNotReady","message":"container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"} Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.845486 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7"] Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.846611 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.848221 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.849245 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.849247 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.850478 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.867084 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-crc" podStartSLOduration=19.867055519 podStartE2EDuration="19.867055519s" podCreationTimestamp="2026-01-31 04:27:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:27:59.866754271 +0000 UTC m=+88.361775976" watchObservedRunningTime="2026-01-31 04:27:59.867055519 +0000 UTC m=+88.362077234" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.910713 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=70.910683917 podStartE2EDuration="1m10.910683917s" podCreationTimestamp="2026-01-31 04:26:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:27:59.891045182 +0000 UTC m=+88.386066887" watchObservedRunningTime="2026-01-31 04:27:59.910683917 +0000 UTC 
m=+88.405705622" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.985749 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-pnwcx" podStartSLOduration=64.985724384 podStartE2EDuration="1m4.985724384s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:27:59.953943713 +0000 UTC m=+88.448965398" watchObservedRunningTime="2026-01-31 04:27:59.985724384 +0000 UTC m=+88.480746089" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.991533 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14bd8c9d-fb46-4f50-b323-369f0a457141-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.991612 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.991728 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14bd8c9d-fb46-4f50-b323-369f0a457141-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.991878 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14bd8c9d-fb46-4f50-b323-369f0a457141-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:27:59 crc kubenswrapper[4812]: I0131 04:27:59.991940 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.009336 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2vzj6" podStartSLOduration=65.009320575 podStartE2EDuration="1m5.009320575s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:27:59.988305683 +0000 UTC m=+88.483327358" watchObservedRunningTime="2026-01-31 04:28:00.009320575 +0000 UTC m=+88.504342250" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.041797 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-h7gqd" podStartSLOduration=65.041778173 podStartE2EDuration="1m5.041778173s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.041127046 +0000 UTC m=+88.536148711" watchObservedRunningTime="2026-01-31 04:28:00.041778173 +0000 UTC m=+88.536799848" Jan 31 04:28:00 crc 
kubenswrapper[4812]: I0131 04:28:00.058490 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podStartSLOduration=65.05847427 podStartE2EDuration="1m5.05847427s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.057900965 +0000 UTC m=+88.552922690" watchObservedRunningTime="2026-01-31 04:28:00.05847427 +0000 UTC m=+88.553495945" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.078177 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-control-plane-749d76644c-r9j25" podStartSLOduration=65.078157066 podStartE2EDuration="1m5.078157066s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.077292464 +0000 UTC m=+88.572314139" watchObservedRunningTime="2026-01-31 04:28:00.078157066 +0000 UTC m=+88.573178731" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092482 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092567 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14bd8c9d-fb46-4f50-b323-369f0a457141-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " 
pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092606 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-cvo-updatepayloads\" (UniqueName: \"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092628 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14bd8c9d-fb46-4f50-b323-369f0a457141-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092659 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14bd8c9d-fb46-4f50-b323-369f0a457141-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.092962 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-ssl-certs\" (UniqueName: \"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-ssl-certs\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.093230 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-cvo-updatepayloads\" (UniqueName: 
\"kubernetes.io/host-path/14bd8c9d-fb46-4f50-b323-369f0a457141-etc-cvo-updatepayloads\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.094197 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/14bd8c9d-fb46-4f50-b323-369f0a457141-service-ca\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.109005 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/14bd8c9d-fb46-4f50-b323-369f0a457141-serving-cert\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.118349 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/14bd8c9d-fb46-4f50-b323-369f0a457141-kube-api-access\") pod \"cluster-version-operator-5c965bbfc6-stws7\" (UID: \"14bd8c9d-fb46-4f50-b323-369f0a457141\") " pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.119001 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler/openshift-kube-scheduler-crc" podStartSLOduration=33.118971818 podStartE2EDuration="33.118971818s" podCreationTimestamp="2026-01-31 04:27:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.117673003 +0000 
UTC m=+88.612694708" watchObservedRunningTime="2026-01-31 04:28:00.118971818 +0000 UTC m=+88.613993533" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.136127 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/kube-controller-manager-crc" podStartSLOduration=67.136105277 podStartE2EDuration="1m7.136105277s" podCreationTimestamp="2026-01-31 04:26:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.135377757 +0000 UTC m=+88.630399432" watchObservedRunningTime="2026-01-31 04:28:00.136105277 +0000 UTC m=+88.631126982" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.172351 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" Jan 31 04:28:00 crc kubenswrapper[4812]: W0131 04:28:00.194050 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod14bd8c9d_fb46_4f50_b323_369f0a457141.slice/crio-193358804b151d175e59753f8a97bfe484e4e32edaf319d04423c0cb6aa47538 WatchSource:0}: Error finding container 193358804b151d175e59753f8a97bfe484e4e32edaf319d04423c0cb6aa47538: Status 404 returned error can't find the container with id 193358804b151d175e59753f8a97bfe484e4e32edaf319d04423c0cb6aa47538 Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.196912 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd/etcd-crc" podStartSLOduration=66.196892573 podStartE2EDuration="1m6.196892573s" podCreationTimestamp="2026-01-31 04:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.194814478 +0000 UTC m=+88.689836163" watchObservedRunningTime="2026-01-31 04:28:00.196892573 +0000 UTC 
m=+88.691914248" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.242719 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-kctmd" podStartSLOduration=65.242696598 podStartE2EDuration="1m5.242696598s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.242108582 +0000 UTC m=+88.737130287" watchObservedRunningTime="2026-01-31 04:28:00.242696598 +0000 UTC m=+88.737718273" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.332206 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2026-02-24 05:53:03 +0000 UTC, rotation deadline is 2025-12-30 02:51:02.210576454 +0000 UTC Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.332283 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Rotating certificates Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.338893 4812 reflector.go:368] Caches populated for *v1.CertificateSigningRequest from k8s.io/client-go/tools/watch/informerwatcher.go:146 Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.339959 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:00 crc kubenswrapper[4812]: E0131 04:28:00.340086 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.851827 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" event={"ID":"14bd8c9d-fb46-4f50-b323-369f0a457141","Type":"ContainerStarted","Data":"374aa02abd2d295b0996c9948d3fc8c60a39e4e27d83fc93aa22c7be4f31e995"} Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.851955 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" event={"ID":"14bd8c9d-fb46-4f50-b323-369f0a457141","Type":"ContainerStarted","Data":"193358804b151d175e59753f8a97bfe484e4e32edaf319d04423c0cb6aa47538"} Jan 31 04:28:00 crc kubenswrapper[4812]: I0131 04:28:00.871814 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-version/cluster-version-operator-5c965bbfc6-stws7" podStartSLOduration=65.871796616 podStartE2EDuration="1m5.871796616s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:00.871323434 +0000 UTC m=+89.366345159" watchObservedRunningTime="2026-01-31 04:28:00.871796616 +0000 UTC m=+89.366818281" Jan 31 04:28:01 crc kubenswrapper[4812]: I0131 04:28:01.338711 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:01 crc kubenswrapper[4812]: I0131 04:28:01.339585 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:28:01 crc kubenswrapper[4812]: I0131 04:28:01.338787 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:01 crc kubenswrapper[4812]: I0131 04:28:01.338790 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:01 crc kubenswrapper[4812]: E0131 04:28:01.339798 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:28:01 crc kubenswrapper[4812]: E0131 04:28:01.339821 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:01 crc kubenswrapper[4812]: E0131 04:28:01.339927 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:01 crc kubenswrapper[4812]: E0131 04:28:01.340189 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:02 crc kubenswrapper[4812]: I0131 04:28:02.338807 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:02 crc kubenswrapper[4812]: E0131 04:28:02.342099 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:03 crc kubenswrapper[4812]: I0131 04:28:03.339077 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:03 crc kubenswrapper[4812]: I0131 04:28:03.339137 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:03 crc kubenswrapper[4812]: I0131 04:28:03.339178 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:03 crc kubenswrapper[4812]: E0131 04:28:03.339272 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:03 crc kubenswrapper[4812]: E0131 04:28:03.339393 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:03 crc kubenswrapper[4812]: E0131 04:28:03.339527 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:04 crc kubenswrapper[4812]: I0131 04:28:04.339524 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:04 crc kubenswrapper[4812]: E0131 04:28:04.339706 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:05 crc kubenswrapper[4812]: I0131 04:28:05.338831 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:05 crc kubenswrapper[4812]: I0131 04:28:05.338896 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:05 crc kubenswrapper[4812]: I0131 04:28:05.338862 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:05 crc kubenswrapper[4812]: E0131 04:28:05.339046 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:05 crc kubenswrapper[4812]: E0131 04:28:05.339234 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:05 crc kubenswrapper[4812]: E0131 04:28:05.339452 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:06 crc kubenswrapper[4812]: I0131 04:28:06.339663 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:06 crc kubenswrapper[4812]: E0131 04:28:06.339927 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:07 crc kubenswrapper[4812]: I0131 04:28:07.339069 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:07 crc kubenswrapper[4812]: I0131 04:28:07.339188 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:07 crc kubenswrapper[4812]: I0131 04:28:07.339069 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:07 crc kubenswrapper[4812]: E0131 04:28:07.339245 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:07 crc kubenswrapper[4812]: E0131 04:28:07.339374 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:07 crc kubenswrapper[4812]: E0131 04:28:07.339576 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:08 crc kubenswrapper[4812]: I0131 04:28:08.339052 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:08 crc kubenswrapper[4812]: E0131 04:28:08.339207 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:09 crc kubenswrapper[4812]: I0131 04:28:09.338648 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:09 crc kubenswrapper[4812]: I0131 04:28:09.338692 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:09 crc kubenswrapper[4812]: E0131 04:28:09.338770 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:09 crc kubenswrapper[4812]: I0131 04:28:09.338885 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:09 crc kubenswrapper[4812]: E0131 04:28:09.338914 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:09 crc kubenswrapper[4812]: E0131 04:28:09.339082 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:10 crc kubenswrapper[4812]: I0131 04:28:10.340993 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:10 crc kubenswrapper[4812]: E0131 04:28:10.341250 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:11 crc kubenswrapper[4812]: I0131 04:28:11.338966 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:11 crc kubenswrapper[4812]: I0131 04:28:11.338984 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:11 crc kubenswrapper[4812]: I0131 04:28:11.339101 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:11 crc kubenswrapper[4812]: E0131 04:28:11.339235 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:11 crc kubenswrapper[4812]: E0131 04:28:11.339463 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:11 crc kubenswrapper[4812]: E0131 04:28:11.339833 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:12 crc kubenswrapper[4812]: I0131 04:28:12.339300 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:12 crc kubenswrapper[4812]: E0131 04:28:12.341274 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:13 crc kubenswrapper[4812]: I0131 04:28:13.339077 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:13 crc kubenswrapper[4812]: E0131 04:28:13.339720 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:13 crc kubenswrapper[4812]: I0131 04:28:13.339172 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:13 crc kubenswrapper[4812]: E0131 04:28:13.339865 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:13 crc kubenswrapper[4812]: I0131 04:28:13.339121 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:13 crc kubenswrapper[4812]: E0131 04:28:13.339981 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:14 crc kubenswrapper[4812]: I0131 04:28:14.338817 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:14 crc kubenswrapper[4812]: E0131 04:28:14.338958 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:14 crc kubenswrapper[4812]: I0131 04:28:14.340041 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:28:14 crc kubenswrapper[4812]: E0131 04:28:14.340297 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:28:14 crc kubenswrapper[4812]: I0131 04:28:14.690050 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:14 crc kubenswrapper[4812]: E0131 04:28:14.690233 4812 secret.go:188] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:28:14 crc kubenswrapper[4812]: E0131 04:28:14.690318 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs podName:2c369253-313a-484c-bc8a-dae99abab086 nodeName:}" failed. No retries permitted until 2026-01-31 04:29:18.690295508 +0000 UTC m=+167.185317203 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs") pod "network-metrics-daemon-wg68w" (UID: "2c369253-313a-484c-bc8a-dae99abab086") : object "openshift-multus"/"metrics-daemon-secret" not registered Jan 31 04:28:15 crc kubenswrapper[4812]: I0131 04:28:15.338766 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:15 crc kubenswrapper[4812]: I0131 04:28:15.338829 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:15 crc kubenswrapper[4812]: I0131 04:28:15.338977 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:15 crc kubenswrapper[4812]: E0131 04:28:15.339137 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:15 crc kubenswrapper[4812]: E0131 04:28:15.339447 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:15 crc kubenswrapper[4812]: E0131 04:28:15.339329 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:16 crc kubenswrapper[4812]: I0131 04:28:16.339966 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:16 crc kubenswrapper[4812]: E0131 04:28:16.340113 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:17 crc kubenswrapper[4812]: I0131 04:28:17.339298 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:17 crc kubenswrapper[4812]: I0131 04:28:17.339365 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:17 crc kubenswrapper[4812]: E0131 04:28:17.339489 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:17 crc kubenswrapper[4812]: I0131 04:28:17.339583 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:17 crc kubenswrapper[4812]: E0131 04:28:17.339767 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:17 crc kubenswrapper[4812]: E0131 04:28:17.339951 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:18 crc kubenswrapper[4812]: I0131 04:28:18.338910 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:18 crc kubenswrapper[4812]: E0131 04:28:18.340287 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:19 crc kubenswrapper[4812]: I0131 04:28:19.339303 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:19 crc kubenswrapper[4812]: I0131 04:28:19.339331 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:19 crc kubenswrapper[4812]: I0131 04:28:19.339441 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:19 crc kubenswrapper[4812]: E0131 04:28:19.339598 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:19 crc kubenswrapper[4812]: E0131 04:28:19.339770 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:19 crc kubenswrapper[4812]: E0131 04:28:19.339931 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:20 crc kubenswrapper[4812]: I0131 04:28:20.339676 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:20 crc kubenswrapper[4812]: E0131 04:28:20.340232 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:21 crc kubenswrapper[4812]: I0131 04:28:21.338499 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:21 crc kubenswrapper[4812]: E0131 04:28:21.338626 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:21 crc kubenswrapper[4812]: I0131 04:28:21.338730 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:21 crc kubenswrapper[4812]: I0131 04:28:21.338740 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:21 crc kubenswrapper[4812]: E0131 04:28:21.338981 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:21 crc kubenswrapper[4812]: E0131 04:28:21.339052 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:22 crc kubenswrapper[4812]: I0131 04:28:22.338569 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:22 crc kubenswrapper[4812]: E0131 04:28:22.339680 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:23 crc kubenswrapper[4812]: I0131 04:28:23.338682 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:23 crc kubenswrapper[4812]: I0131 04:28:23.338807 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:23 crc kubenswrapper[4812]: I0131 04:28:23.338870 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:23 crc kubenswrapper[4812]: E0131 04:28:23.339036 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:23 crc kubenswrapper[4812]: E0131 04:28:23.339137 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:23 crc kubenswrapper[4812]: E0131 04:28:23.339313 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:24 crc kubenswrapper[4812]: I0131 04:28:24.339175 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:24 crc kubenswrapper[4812]: E0131 04:28:24.340120 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:25 crc kubenswrapper[4812]: I0131 04:28:25.338876 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:25 crc kubenswrapper[4812]: I0131 04:28:25.338884 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:25 crc kubenswrapper[4812]: I0131 04:28:25.338881 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:25 crc kubenswrapper[4812]: E0131 04:28:25.339245 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:25 crc kubenswrapper[4812]: E0131 04:28:25.339568 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:25 crc kubenswrapper[4812]: E0131 04:28:25.339631 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:26 crc kubenswrapper[4812]: I0131 04:28:26.339069 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:26 crc kubenswrapper[4812]: E0131 04:28:26.339304 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:27 crc kubenswrapper[4812]: I0131 04:28:27.339043 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:27 crc kubenswrapper[4812]: I0131 04:28:27.339092 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:27 crc kubenswrapper[4812]: I0131 04:28:27.339522 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:27 crc kubenswrapper[4812]: E0131 04:28:27.339926 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:27 crc kubenswrapper[4812]: E0131 04:28:27.339986 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:27 crc kubenswrapper[4812]: I0131 04:28:27.340033 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:28:27 crc kubenswrapper[4812]: E0131 04:28:27.340068 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:27 crc kubenswrapper[4812]: E0131 04:28:27.340521 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"ovnkube-controller\" with CrashLoopBackOff: \"back-off 40s restarting failed container=ovnkube-controller pod=ovnkube-node-bl2f9_openshift-ovn-kubernetes(d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a)\"" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" Jan 31 04:28:28 crc kubenswrapper[4812]: I0131 04:28:28.338936 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:28 crc kubenswrapper[4812]: E0131 04:28:28.339169 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.339002 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.339061 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.339160 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:29 crc kubenswrapper[4812]: E0131 04:28:29.339361 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:29 crc kubenswrapper[4812]: E0131 04:28:29.339723 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:29 crc kubenswrapper[4812]: E0131 04:28:29.339601 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.945791 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/1.log" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.946620 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/0.log" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.946683 4812 generic.go:334] "Generic (PLEG): container finished" podID="6050f642-2492-4f83-a739-ac905c409b8c" containerID="4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb" exitCode=1 Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.946720 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerDied","Data":"4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb"} Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.946773 4812 scope.go:117] "RemoveContainer" containerID="d5f4d70805c4521d6ef458d3290ed2da52b29b482290ca9729ad7aada9fd8049" Jan 31 04:28:29 crc kubenswrapper[4812]: I0131 04:28:29.947503 4812 scope.go:117] "RemoveContainer" containerID="4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb" Jan 31 04:28:29 crc kubenswrapper[4812]: E0131 04:28:29.947943 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 10s restarting failed container=kube-multus pod=multus-pnwcx_openshift-multus(6050f642-2492-4f83-a739-ac905c409b8c)\"" pod="openshift-multus/multus-pnwcx" podUID="6050f642-2492-4f83-a739-ac905c409b8c" Jan 31 04:28:30 crc kubenswrapper[4812]: I0131 04:28:30.339572 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:30 crc kubenswrapper[4812]: E0131 04:28:30.339795 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:30 crc kubenswrapper[4812]: I0131 04:28:30.959021 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/1.log" Jan 31 04:28:31 crc kubenswrapper[4812]: I0131 04:28:31.338557 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:31 crc kubenswrapper[4812]: I0131 04:28:31.338658 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:31 crc kubenswrapper[4812]: I0131 04:28:31.338681 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:31 crc kubenswrapper[4812]: E0131 04:28:31.338762 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:31 crc kubenswrapper[4812]: E0131 04:28:31.338816 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:31 crc kubenswrapper[4812]: E0131 04:28:31.338939 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:32 crc kubenswrapper[4812]: E0131 04:28:32.280250 4812 kubelet_node_status.go:497] "Node not becoming ready in time after startup" Jan 31 04:28:32 crc kubenswrapper[4812]: I0131 04:28:32.338554 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:32 crc kubenswrapper[4812]: E0131 04:28:32.340569 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:32 crc kubenswrapper[4812]: E0131 04:28:32.443073 4812 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:28:33 crc kubenswrapper[4812]: I0131 04:28:33.339488 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:33 crc kubenswrapper[4812]: E0131 04:28:33.339668 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:33 crc kubenswrapper[4812]: I0131 04:28:33.339830 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:33 crc kubenswrapper[4812]: E0131 04:28:33.340062 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:33 crc kubenswrapper[4812]: I0131 04:28:33.340400 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:33 crc kubenswrapper[4812]: E0131 04:28:33.340727 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:34 crc kubenswrapper[4812]: I0131 04:28:34.338820 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:34 crc kubenswrapper[4812]: E0131 04:28:34.339081 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:35 crc kubenswrapper[4812]: I0131 04:28:35.339331 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:35 crc kubenswrapper[4812]: E0131 04:28:35.339896 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:35 crc kubenswrapper[4812]: I0131 04:28:35.339410 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:35 crc kubenswrapper[4812]: E0131 04:28:35.340009 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:35 crc kubenswrapper[4812]: I0131 04:28:35.339390 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:35 crc kubenswrapper[4812]: E0131 04:28:35.340098 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:36 crc kubenswrapper[4812]: I0131 04:28:36.339129 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:36 crc kubenswrapper[4812]: E0131 04:28:36.339389 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:37 crc kubenswrapper[4812]: I0131 04:28:37.339616 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:37 crc kubenswrapper[4812]: I0131 04:28:37.339672 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:37 crc kubenswrapper[4812]: I0131 04:28:37.339639 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:37 crc kubenswrapper[4812]: E0131 04:28:37.340585 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:37 crc kubenswrapper[4812]: E0131 04:28:37.340728 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:37 crc kubenswrapper[4812]: E0131 04:28:37.341085 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:37 crc kubenswrapper[4812]: E0131 04:28:37.444835 4812 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:28:38 crc kubenswrapper[4812]: I0131 04:28:38.339301 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:38 crc kubenswrapper[4812]: E0131 04:28:38.340024 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:38 crc kubenswrapper[4812]: I0131 04:28:38.340526 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:28:38 crc kubenswrapper[4812]: I0131 04:28:38.995123 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/3.log" Jan 31 04:28:38 crc kubenswrapper[4812]: I0131 04:28:38.998255 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerStarted","Data":"1df698192345d3a05e597c4d8c10555bf54aaa604d42000637efa3cc4157d915"} Jan 31 04:28:38 crc kubenswrapper[4812]: I0131 04:28:38.998731 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.330517 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podStartSLOduration=104.330490551 podStartE2EDuration="1m44.330490551s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:39.04633904 +0000 UTC m=+127.541360715" watchObservedRunningTime="2026-01-31 04:28:39.330490551 +0000 UTC m=+127.825512236" Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.331101 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wg68w"] Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.331216 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:39 crc kubenswrapper[4812]: E0131 04:28:39.331356 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.338830 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.338965 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:39 crc kubenswrapper[4812]: E0131 04:28:39.339038 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:39 crc kubenswrapper[4812]: E0131 04:28:39.339216 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:39 crc kubenswrapper[4812]: I0131 04:28:39.339512 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:39 crc kubenswrapper[4812]: E0131 04:28:39.339635 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:41 crc kubenswrapper[4812]: I0131 04:28:41.339277 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:41 crc kubenswrapper[4812]: I0131 04:28:41.339286 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:41 crc kubenswrapper[4812]: I0131 04:28:41.339309 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:41 crc kubenswrapper[4812]: E0131 04:28:41.340051 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:41 crc kubenswrapper[4812]: E0131 04:28:41.339782 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:41 crc kubenswrapper[4812]: I0131 04:28:41.339364 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:41 crc kubenswrapper[4812]: E0131 04:28:41.340156 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:41 crc kubenswrapper[4812]: E0131 04:28:41.340274 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:42 crc kubenswrapper[4812]: E0131 04:28:42.446042 4812 kubelet.go:2916] "Container runtime network not ready" networkReady="NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:28:43 crc kubenswrapper[4812]: I0131 04:28:43.339204 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:43 crc kubenswrapper[4812]: E0131 04:28:43.339493 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:43 crc kubenswrapper[4812]: I0131 04:28:43.339252 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:43 crc kubenswrapper[4812]: E0131 04:28:43.339733 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:43 crc kubenswrapper[4812]: I0131 04:28:43.339268 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:43 crc kubenswrapper[4812]: E0131 04:28:43.339919 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:43 crc kubenswrapper[4812]: I0131 04:28:43.339223 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:43 crc kubenswrapper[4812]: E0131 04:28:43.340399 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:44 crc kubenswrapper[4812]: I0131 04:28:44.340099 4812 scope.go:117] "RemoveContainer" containerID="4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb" Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.025538 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/1.log" Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.026046 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerStarted","Data":"ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d"} Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.339387 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.339424 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:45 crc kubenswrapper[4812]: E0131 04:28:45.339566 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.339735 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:45 crc kubenswrapper[4812]: E0131 04:28:45.339931 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:45 crc kubenswrapper[4812]: I0131 04:28:45.339941 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:45 crc kubenswrapper[4812]: E0131 04:28:45.340205 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:45 crc kubenswrapper[4812]: E0131 04:28:45.340285 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:47 crc kubenswrapper[4812]: I0131 04:28:47.338980 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:47 crc kubenswrapper[4812]: I0131 04:28:47.339073 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:47 crc kubenswrapper[4812]: I0131 04:28:47.339073 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:47 crc kubenswrapper[4812]: I0131 04:28:47.339108 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:47 crc kubenswrapper[4812]: E0131 04:28:47.339246 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-wg68w" podUID="2c369253-313a-484c-bc8a-dae99abab086" Jan 31 04:28:47 crc kubenswrapper[4812]: E0131 04:28:47.339378 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" podUID="9d751cbb-f2e2-430d-9754-c882a5e924a5" Jan 31 04:28:47 crc kubenswrapper[4812]: E0131 04:28:47.339535 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-xd92c" podUID="3b6479f0-333b-4a96-9adf-2099afdc2447" Jan 31 04:28:47 crc kubenswrapper[4812]: E0131 04:28:47.339730 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" podUID="5fe485a1-e14f-4c09-b5b9-f252bc42b7e8" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.339098 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.339163 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.339221 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.339110 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.342592 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.344044 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.344323 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.344891 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.345358 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.347162 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 04:28:49 crc kubenswrapper[4812]: I0131 04:28:49.655552 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.464584 4812 kubelet_node_status.go:724] "Recording event message for node" node="crc" event="NodeReady" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.517830 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tt2tp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.518933 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.523484 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.524153 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.525300 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.528529 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.529106 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.529498 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.529705 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bw8rt"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.529939 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.530004 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.530191 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.530365 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.530570 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.531017 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.531165 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.531722 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.531824 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.532349 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.532984 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.533303 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.533336 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.534333 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.534570 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.538245 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.538775 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.549199 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.550132 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.551345 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.552322 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.558010 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.559244 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.559714 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.559950 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.560237 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.560471 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.560590 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.560660 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.561644 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rm7wz"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.562187 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.562767 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.563187 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.563251 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.563318 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.563467 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.564405 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.564750 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.565220 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.565535 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.565919 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.567312 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.568259 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.574081 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.575571 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.577195 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.577556 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.577702 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.578786 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.578973 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.579151 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.579385 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.579897 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580055 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 
04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580289 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580434 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580580 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580585 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/console-f9d7485db-4f2hw"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.580814 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.581170 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.585004 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.588487 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.602972 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.628228 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.628868 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.628908 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629167 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629202 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629253 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629288 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629323 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629353 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629623 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fxvxq"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.630160 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.629172 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.632000 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-7954f5f757-plj5m"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.632685 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.638707 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639016 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639200 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639283 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639389 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639422 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639484 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: 
I0131 04:28:50.639521 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639587 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639616 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.639693 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.641408 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b2n85"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.641930 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.643251 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.644139 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.644243 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.649060 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.649631 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 
31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.649753 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.649943 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.650067 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.650187 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.650306 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.651099 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.651181 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.651311 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.651349 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.651423 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.653262 4812 reflector.go:368] Caches populated for *v1.Secret 
from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.654540 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.655289 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.655885 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.656378 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.656487 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.658993 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.659232 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.659536 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.659776 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42rlr"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.660016 4812 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.660245 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-5444994796-2hcvc"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.660545 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.660560 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.661380 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.662285 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.662602 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.662864 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.662980 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.663135 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.664237 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.678949 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.683690 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.688500 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.688864 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.689096 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.689272 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tt2tp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.689949 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.690732 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.690878 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 
31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.691525 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.693802 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.694204 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.694285 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.696158 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.696772 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.697982 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704403 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704512 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704719 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704867 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704890 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.704972 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.705104 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.705164 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.706911 4812 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["openshift-multus/multus-admission-controller-857f4d67dd-2mv2d"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.707654 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.707805 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.707981 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.708300 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.708519 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.709928 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.710668 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.711748 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.712797 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.713386 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.713567 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.713730 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.714464 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.715027 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.715337 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.716031 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.716209 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qlmj"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.716952 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.717116 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.717748 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.718021 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.718586 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.719024 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.719567 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.719949 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.721948 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.721990 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bw8rt"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.722026 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.723325 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.725670 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b2n85"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.728049 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.728090 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lwmmp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.729128 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.729224 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.730710 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731196 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-config\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731225 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlqc2\" (UniqueName: \"kubernetes.io/projected/a2f275c0-4422-45c9-8d3b-c022a4322df5-kube-api-access-jlqc2\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731244 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8458779-3738-4315-a142-4b5287a2b8fa-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731264 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b57ts\" (UniqueName: \"kubernetes.io/projected/dea03f3c-7f0b-4026-82e7-f3bc79397a29-kube-api-access-b57ts\") pod \"machine-approver-56656f9798-bf26c\" 
(UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731280 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731295 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731312 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6980a4-d26b-4132-8da4-650ed74e8a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731326 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2e5ff5-1897-488c-9823-706462fbc903-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731344 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-serving-cert\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731364 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-config\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731384 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6980a4-d26b-4132-8da4-650ed74e8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731404 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731427 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mvzz7\" (UniqueName: \"kubernetes.io/projected/e5c2893c-e678-4c5e-8692-8d50c2510ded-kube-api-access-mvzz7\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " 
pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731448 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731467 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731490 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-encryption-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731511 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-images\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731533 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8458779-3738-4315-a142-4b5287a2b8fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731552 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731580 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-auth-proxy-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731603 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2zw7j\" (UniqueName: \"kubernetes.io/projected/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-kube-api-access-2zw7j\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: \"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731634 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43e00fbb-667f-43f7-b399-31cfcea2ba2f-metrics-tls\") pod \"dns-operator-744455d44c-fxvxq\" (UID: 
\"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731654 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-service-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731675 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fd2t6\" (UniqueName: \"kubernetes.io/projected/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-kube-api-access-fd2t6\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731714 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731739 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: 
I0131 04:28:50.731760 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5c2893c-e678-4c5e-8692-8d50c2510ded-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731780 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-image-import-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731799 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-trusted-ca-bundle\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731820 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-serving-cert\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731861 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config\") pod \"controller-manager-879f6c89f-6246k\" (UID: 
\"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731884 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-oauth-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731904 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqxxp\" (UniqueName: \"kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731907 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731923 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731945 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731945 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: 
\"kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.731983 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2e5ff5-1897-488c-9823-706462fbc903-config\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732094 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f275c0-4422-45c9-8d3b-c022a4322df5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732121 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732181 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: 
\"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732210 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea03f3c-7f0b-4026-82e7-f3bc79397a29-machine-approver-tls\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732253 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svwxx\" (UniqueName: \"kubernetes.io/projected/43e00fbb-667f-43f7-b399-31cfcea2ba2f-kube-api-access-svwxx\") pod \"dns-operator-744455d44c-fxvxq\" (UID: \"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732272 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca2e5ff5-1897-488c-9823-706462fbc903-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732311 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732364 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-node-pullsecrets\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732386 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732434 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sf62j\" (UniqueName: \"kubernetes.io/projected/ca95cb69-0209-4b77-8d05-608c83cdddc2-kube-api-access-sf62j\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732459 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dc5wv\" (UniqueName: \"kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732478 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: 
\"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732512 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732564 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-client\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732593 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit-dir\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732637 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-729cq\" (UniqueName: \"kubernetes.io/projected/7f6980a4-d26b-4132-8da4-650ed74e8a55-kube-api-access-729cq\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732655 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nzhd2\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-kube-api-access-nzhd2\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732680 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-95c4w\" (UniqueName: \"kubernetes.io/projected/d8458779-3738-4315-a142-4b5287a2b8fa-kube-api-access-95c4w\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732702 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732717 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-service-ca\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732756 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x5mx2\" (UniqueName: 
\"kubernetes.io/projected/ccd8432b-f254-450e-9b70-e0e89ead504d-kube-api-access-x5mx2\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732775 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21423ada-b4f2-49f7-9cb7-edf1025fe79e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732791 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732812 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-serving-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732827 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732869 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21423ada-b4f2-49f7-9cb7-edf1025fe79e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732886 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732929 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f275c0-4422-45c9-8d3b-c022a4322df5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732966 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.732993 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.733011 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.733026 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-oauth-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.733051 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.733057 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.733099 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-console-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.737951 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.738019 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.739219 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.741338 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.743400 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-plj5m"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.750241 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rm7wz"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.754674 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4f2hw"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.763640 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.765073 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.766439 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-dns-operator/dns-operator-744455d44c-fxvxq"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.766471 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.768111 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.770411 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-8fkq9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.772334 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.772415 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.773769 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.776017 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qlmj"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.778910 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.779798 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2mv2d"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.780914 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.781189 4812 
kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.782388 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8fkq9"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.783746 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.784731 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lwmmp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.786010 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.787207 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.788512 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42rlr"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.789581 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.790760 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.791825 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.793044 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-machine-config-operator/machine-config-server-qml6c"] Jan 31 04:28:50 crc 
kubenswrapper[4812]: I0131 04:28:50.793619 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.794562 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gxcsb"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.795173 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.795942 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxcsb"] Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.800916 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.821449 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833466 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cfacc7d-5b16-47f9-8650-50a0810479ab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833557 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-95c4w\" (UniqueName: \"kubernetes.io/projected/d8458779-3738-4315-a142-4b5287a2b8fa-kube-api-access-95c4w\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833642 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df192deb-9b42-48e1-86d7-b85b217d6c1e-serving-cert\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833720 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tvt6l\" (UniqueName: \"kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833803 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.833904 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-service-ca\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834013 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-trusted-ca\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834106 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-config\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834208 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21423ada-b4f2-49f7-9cb7-edf1025fe79e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834284 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834352 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " 
pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834495 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x5mx2\" (UniqueName: \"kubernetes.io/projected/ccd8432b-f254-450e-9b70-e0e89ead504d-kube-api-access-x5mx2\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834580 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9058f87-b187-44c0-b302-712072520e59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834650 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834723 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834806 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834865 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-service-ca\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834904 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/13d46677-e2af-4751-b6cf-346aca6a8e46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.834978 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-serving-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835006 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835030 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21423ada-b4f2-49f7-9cb7-edf1025fe79e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835058 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f275c0-4422-45c9-8d3b-c022a4322df5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835080 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835107 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835130 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835153 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835175 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-oauth-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835199 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-console-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835226 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9058f87-b187-44c0-b302-712072520e59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 
04:28:50.835256 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-config\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835280 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlqc2\" (UniqueName: \"kubernetes.io/projected/a2f275c0-4422-45c9-8d3b-c022a4322df5-kube-api-access-jlqc2\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835285 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835305 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fbfnq\" (UniqueName: \"kubernetes.io/projected/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-kube-api-access-fbfnq\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: \"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835349 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8458779-3738-4315-a142-4b5287a2b8fa-serving-cert\") pod 
\"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835636 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-b57ts\" (UniqueName: \"kubernetes.io/projected/dea03f3c-7f0b-4026-82e7-f3bc79397a29-kube-api-access-b57ts\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835708 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835746 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835806 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835869 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/ca2e5ff5-1897-488c-9823-706462fbc903-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.835969 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bdfxq\" (UniqueName: \"kubernetes.io/projected/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-kube-api-access-bdfxq\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.836005 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-serving-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.836010 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfacc7d-5b16-47f9-8650-50a0810479ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.836053 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6980a4-d26b-4132-8da4-650ed74e8a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc 
kubenswrapper[4812]: I0131 04:28:50.836817 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/21423ada-b4f2-49f7-9cb7-edf1025fe79e-trusted-ca\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.837392 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-oauth-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.837889 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-config\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.837996 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-serving-cert\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.838011 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-console-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc 
kubenswrapper[4812]: I0131 04:28:50.838146 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.838261 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-config\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.838313 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mvzz7\" (UniqueName: \"kubernetes.io/projected/e5c2893c-e678-4c5e-8692-8d50c2510ded-kube-api-access-mvzz7\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.838342 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2wlzt\" (UniqueName: \"kubernetes.io/projected/df192deb-9b42-48e1-86d7-b85b217d6c1e-kube-api-access-2wlzt\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.838369 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-profile-collector-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: 
\"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.839757 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.839922 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a2f275c0-4422-45c9-8d3b-c022a4322df5-serving-cert\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840158 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-registry-operator-tls\" (UniqueName: \"kubernetes.io/secret/21423ada-b4f2-49f7-9cb7-edf1025fe79e-image-registry-operator-tls\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840473 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6980a4-d26b-4132-8da4-650ed74e8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840638 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840743 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840821 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840777 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/7f6980a4-d26b-4132-8da4-650ed74e8a55-serving-cert\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.840999 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9058f87-b187-44c0-b302-712072520e59-config\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: 
\"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841100 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-config\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841278 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841405 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841410 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j57qc\" (UniqueName: \"kubernetes.io/projected/4b5bdd43-4fae-4142-8905-f30435ac9180-kube-api-access-j57qc\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841557 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-encryption-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841603 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-images\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841735 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841828 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2zw7j\" (UniqueName: \"kubernetes.io/projected/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-kube-api-access-2zw7j\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: \"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842081 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8458779-3738-4315-a142-4b5287a2b8fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842107 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842135 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-auth-proxy-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842152 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-service-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842170 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fd2t6\" (UniqueName: \"kubernetes.io/projected/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-kube-api-access-fd2t6\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842239 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" 
(UniqueName: \"kubernetes.io/secret/6cfacc7d-5b16-47f9-8650-50a0810479ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842273 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43e00fbb-667f-43f7-b399-31cfcea2ba2f-metrics-tls\") pod \"dns-operator-744455d44c-fxvxq\" (UID: \"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842293 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lcn8f\" (UniqueName: \"kubernetes.io/projected/9e3da597-67fd-4e7c-8e35-4ef12610beef-kube-api-access-lcn8f\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842319 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/7f6980a4-d26b-4132-8da4-650ed74e8a55-config\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842352 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/e5c2893c-e678-4c5e-8692-8d50c2510ded-images\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 
04:28:50.842333 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842407 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842406 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca2e5ff5-1897-488c-9823-706462fbc903-serving-cert\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842420 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-serving-cert\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842478 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842505 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5c2893c-e678-4c5e-8692-8d50c2510ded-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842530 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842552 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-serving-cert\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842571 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842592 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-service-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " 
pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842615 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-image-import-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842632 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-trusted-ca-bundle\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842650 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-oauth-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842667 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kqxxp\" (UniqueName: \"kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842683 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7pds\" (UniqueName: \"kubernetes.io/projected/13d46677-e2af-4751-b6cf-346aca6a8e46-kube-api-access-l7pds\") pod 
\"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842701 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842719 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2e5ff5-1897-488c-9823-706462fbc903-config\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842735 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f275c0-4422-45c9-8d3b-c022a4322df5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842754 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842770 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea03f3c-7f0b-4026-82e7-f3bc79397a29-machine-approver-tls\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842787 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842804 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842809 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-service-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.841998 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " 
pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842825 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svwxx\" (UniqueName: \"kubernetes.io/projected/43e00fbb-667f-43f7-b399-31cfcea2ba2f-kube-api-access-svwxx\") pod \"dns-operator-744455d44c-fxvxq\" (UID: \"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842884 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca2e5ff5-1897-488c-9823-706462fbc903-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842908 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dmd6m\" (UniqueName: \"kubernetes.io/projected/e460d967-199b-41b2-a198-3acaaa1f4382-kube-api-access-dmd6m\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842930 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842948 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" 
(UniqueName: \"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-srv-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842966 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842994 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-node-pullsecrets\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843011 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: \"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843039 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sf62j\" (UniqueName: \"kubernetes.io/projected/ca95cb69-0209-4b77-8d05-608c83cdddc2-kube-api-access-sf62j\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843036 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843059 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dc5wv\" (UniqueName: \"kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843080 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: \"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843100 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843118 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-config\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " 
pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843135 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-client\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843336 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843390 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/dea03f3c-7f0b-4026-82e7-f3bc79397a29-auth-proxy-config\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843498 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-pullsecrets\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-node-pullsecrets\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843517 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " 
pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843528 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-srv-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843554 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nzhd2\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-kube-api-access-nzhd2\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843574 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-client\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843605 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit-dir\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.843624 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-729cq\" (UniqueName: 
\"kubernetes.io/projected/7f6980a4-d26b-4132-8da4-650ed74e8a55-kube-api-access-729cq\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.844160 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"available-featuregates\" (UniqueName: \"kubernetes.io/empty-dir/d8458779-3738-4315-a142-4b5287a2b8fa-available-featuregates\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.844588 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-serving-cert\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.844592 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.845637 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.845640 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-serving-cert\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.845705 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-trusted-ca-bundle\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.846041 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.846339 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-etcd-client\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.846345 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d8458779-3738-4315-a142-4b5287a2b8fa-serving-cert\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:50 
crc kubenswrapper[4812]: I0131 04:28:50.846550 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/ca95cb69-0209-4b77-8d05-608c83cdddc2-encryption-config\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.846785 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/43e00fbb-667f-43f7-b399-31cfcea2ba2f-metrics-tls\") pod \"dns-operator-744455d44c-fxvxq\" (UID: \"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.846933 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"image-import-ca\" (UniqueName: \"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-image-import-ca\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.847094 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.847634 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-serving-cert\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.847634 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.842805 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.847702 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/ca95cb69-0209-4b77-8d05-608c83cdddc2-audit-dir\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.848290 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a2f275c0-4422-45c9-8d3b-c022a4322df5-config\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.848372 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/ca95cb69-0209-4b77-8d05-608c83cdddc2-trusted-ca-bundle\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.848823 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-api-operator-tls\" (UniqueName: \"kubernetes.io/secret/e5c2893c-e678-4c5e-8692-8d50c2510ded-machine-api-operator-tls\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.848918 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.849061 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/ca2e5ff5-1897-488c-9823-706462fbc903-config\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.849329 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/ccd8432b-f254-450e-9b70-e0e89ead504d-trusted-ca-bundle\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.849890 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-samples-operator-tls\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: \"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.850226 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"machine-approver-tls\" (UniqueName: \"kubernetes.io/secret/dea03f3c-7f0b-4026-82e7-f3bc79397a29-machine-approver-tls\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.850444 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.851203 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.851945 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/ccd8432b-f254-450e-9b70-e0e89ead504d-console-oauth-config\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " 
pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.854754 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.857926 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.862430 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.882046 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.901749 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.922284 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.943260 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944326 
4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-service-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944403 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l7pds\" (UniqueName: \"kubernetes.io/projected/13d46677-e2af-4751-b6cf-346aca6a8e46-kube-api-access-l7pds\") pod \"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944469 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-srv-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944507 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dmd6m\" (UniqueName: \"kubernetes.io/projected/e460d967-199b-41b2-a198-3acaaa1f4382-kube-api-access-dmd6m\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944589 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: 
\"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944659 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-config\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944695 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-srv-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944746 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-client\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944790 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df192deb-9b42-48e1-86d7-b85b217d6c1e-serving-cert\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944826 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cfacc7d-5b16-47f9-8650-50a0810479ab-kube-api-access\") pod 
\"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944886 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-trusted-ca\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944918 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-config\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.944949 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tvt6l\" (UniqueName: \"kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945002 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945036 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9058f87-b187-44c0-b302-712072520e59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945075 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945111 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945148 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/13d46677-e2af-4751-b6cf-346aca6a8e46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945189 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/f9058f87-b187-44c0-b302-712072520e59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945231 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fbfnq\" (UniqueName: \"kubernetes.io/projected/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-kube-api-access-fbfnq\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: \"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945283 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945318 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bdfxq\" (UniqueName: \"kubernetes.io/projected/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-kube-api-access-bdfxq\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945368 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfacc7d-5b16-47f9-8650-50a0810479ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc 
kubenswrapper[4812]: I0131 04:28:50.945420 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2wlzt\" (UniqueName: \"kubernetes.io/projected/df192deb-9b42-48e1-86d7-b85b217d6c1e-kube-api-access-2wlzt\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945431 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-config\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945451 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-profile-collector-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945485 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9058f87-b187-44c0-b302-712072520e59-config\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945522 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j57qc\" (UniqueName: \"kubernetes.io/projected/4b5bdd43-4fae-4142-8905-f30435ac9180-kube-api-access-j57qc\") pod \"catalog-operator-68c6474976-j2ppp\" 
(UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945558 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945630 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfacc7d-5b16-47f9-8650-50a0810479ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945697 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lcn8f\" (UniqueName: \"kubernetes.io/projected/9e3da597-67fd-4e7c-8e35-4ef12610beef-kube-api-access-lcn8f\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945729 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-serving-cert\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.945760 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" 
(UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.946415 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f9058f87-b187-44c0-b302-712072520e59-config\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.946705 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/df192deb-9b42-48e1-86d7-b85b217d6c1e-trusted-ca\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.948246 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/df192deb-9b42-48e1-86d7-b85b217d6c1e-serving-cert\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.949189 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f9058f87-b187-44c0-b302-712072520e59-serving-cert\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:50 crc kubenswrapper[4812]: 
I0131 04:28:50.961398 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 04:28:50 crc kubenswrapper[4812]: I0131 04:28:50.981384 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.001638 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.022551 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.041828 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.062071 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.082377 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.102971 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.123781 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.127338 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 
31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.141626 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.146199 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-service-ca\" (UniqueName: \"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-service-ca\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.161729 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.182022 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.201768 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.212181 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-serving-cert\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.223095 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.242185 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.249696 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-etcd-client\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.262026 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.271693 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6cfacc7d-5b16-47f9-8650-50a0810479ab-serving-cert\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.282393 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.287211 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6cfacc7d-5b16-47f9-8650-50a0810479ab-config\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.302151 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.306382 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-config\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.321653 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.342520 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.351995 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.362372 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.383038 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.387893 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.402193 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 
04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.407648 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.422495 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.442802 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.482376 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.502379 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.522716 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.543050 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.562918 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.571948 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: 
\"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-profile-collector-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.573117 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"profile-collector-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-profile-collector-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.581965 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.602570 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.623943 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.630063 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/4b5bdd43-4fae-4142-8905-f30435ac9180-srv-cert\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.643100 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.651461 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"package-server-manager-serving-cert\" (UniqueName: \"kubernetes.io/secret/13d46677-e2af-4751-b6cf-346aca6a8e46-package-server-manager-serving-cert\") pod \"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.662336 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.668728 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-certs\" (UniqueName: \"kubernetes.io/secret/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-webhook-certs\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: \"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.682640 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.702230 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.719833 4812 request.go:700] Waited for 1.010661788s due to client-side throttling, not priority and fairness, request: GET:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-operator-lifecycle-manager/secrets?fieldSelector=metadata.name%3Dcollect-profiles-dockercfg-kzf4t&limit=500&resourceVersion=0 Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.722196 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.742314 4812 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.763361 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.782904 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.802381 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.822669 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.842347 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.862143 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.894932 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.902371 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.921814 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.942550 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: E0131 04:28:51.945767 4812 secret.go:188] Couldn't get secret openshift-machine-api/control-plane-machine-set-operator-tls: failed to sync secret cache: timed out waiting for the condition Jan 31 04:28:51 crc kubenswrapper[4812]: E0131 04:28:51.945932 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls podName:e460d967-199b-41b2-a198-3acaaa1f4382 nodeName:}" failed. No retries permitted until 2026-01-31 04:28:52.445899008 +0000 UTC m=+140.940920703 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "control-plane-machine-set-operator-tls" (UniqueName: "kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls") pod "control-plane-machine-set-operator-78cbb6b69f-qlf2j" (UID: "e460d967-199b-41b2-a198-3acaaa1f4382") : failed to sync secret cache: timed out waiting for the condition Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.950191 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"srv-cert\" (UniqueName: \"kubernetes.io/secret/9e3da597-67fd-4e7c-8e35-4ef12610beef-srv-cert\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.963107 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 04:28:51 crc kubenswrapper[4812]: I0131 04:28:51.982246 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.003532 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.022133 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.042892 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.063088 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.082944 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.102803 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.123065 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.142579 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.163067 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.194291 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.203441 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.222559 4812 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.243019 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.262088 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.283035 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.302559 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.322366 4812 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.342365 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.363667 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.402937 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.422780 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.442822 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.462547 4812 reflector.go:368] 
Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.472487 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.477544 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"control-plane-machine-set-operator-tls\" (UniqueName: \"kubernetes.io/secret/e460d967-199b-41b2-a198-3acaaa1f4382-control-plane-machine-set-operator-tls\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.483257 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.502741 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.522421 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.542620 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.562440 4812 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.582576 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.629958 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-95c4w\" (UniqueName: \"kubernetes.io/projected/d8458779-3738-4315-a142-4b5287a2b8fa-kube-api-access-95c4w\") pod \"openshift-config-operator-7777fb866f-qcsf7\" (UID: \"d8458779-3738-4315-a142-4b5287a2b8fa\") " pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.642351 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-bound-sa-token\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.670672 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x5mx2\" (UniqueName: \"kubernetes.io/projected/ccd8432b-f254-450e-9b70-e0e89ead504d-kube-api-access-x5mx2\") pod \"console-f9d7485db-4f2hw\" (UID: \"ccd8432b-f254-450e-9b70-e0e89ead504d\") " pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.689057 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-b57ts\" (UniqueName: \"kubernetes.io/projected/dea03f3c-7f0b-4026-82e7-f3bc79397a29-kube-api-access-b57ts\") pod \"machine-approver-56656f9798-bf26c\" (UID: \"dea03f3c-7f0b-4026-82e7-f3bc79397a29\") " pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.708277 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlqc2\" (UniqueName: \"kubernetes.io/projected/a2f275c0-4422-45c9-8d3b-c022a4322df5-kube-api-access-jlqc2\") pod \"openshift-controller-manager-operator-756b6f6bc6-59zlj\" (UID: \"a2f275c0-4422-45c9-8d3b-c022a4322df5\") " pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.721144 4812 request.go:700] Waited for 1.879064191s due to client-side throttling, not priority and fairness, request: POST:https://api-int.crc.testing:6443/api/v1/namespaces/openshift-cluster-samples-operator/serviceaccounts/cluster-samples-operator/token Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.730929 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mvzz7\" (UniqueName: \"kubernetes.io/projected/e5c2893c-e678-4c5e-8692-8d50c2510ded-kube-api-access-mvzz7\") pod \"machine-api-operator-5694c8668f-rm7wz\" (UID: \"e5c2893c-e678-4c5e-8692-8d50c2510ded\") " pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.751640 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2zw7j\" (UniqueName: \"kubernetes.io/projected/41f9cb0c-1919-4647-bc7f-dfc345c0b6be-kube-api-access-2zw7j\") pod \"cluster-samples-operator-665b6dd947-7qdgm\" (UID: \"41f9cb0c-1919-4647-bc7f-dfc345c0b6be\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.761296 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fd2t6\" (UniqueName: \"kubernetes.io/projected/c4261f6c-8fb6-4c68-9fab-5c2f46afcca8-kube-api-access-fd2t6\") pod \"authentication-operator-69f744f599-bw8rt\" (UID: \"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8\") " 
pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.780744 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.795330 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-svwxx\" (UniqueName: \"kubernetes.io/projected/43e00fbb-667f-43f7-b399-31cfcea2ba2f-kube-api-access-svwxx\") pod \"dns-operator-744455d44c-fxvxq\" (UID: \"43e00fbb-667f-43f7-b399-31cfcea2ba2f\") " pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.815493 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dc5wv\" (UniqueName: \"kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv\") pod \"oauth-openshift-558db77b4-57sr9\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.819023 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.823997 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/ca2e5ff5-1897-488c-9823-706462fbc903-kube-api-access\") pod \"kube-apiserver-operator-766d6c64bb-2twh6\" (UID: \"ca2e5ff5-1897-488c-9823-706462fbc903\") " pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.839593 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.845455 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.850429 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sf62j\" (UniqueName: \"kubernetes.io/projected/ca95cb69-0209-4b77-8d05-608c83cdddc2-kube-api-access-sf62j\") pod \"apiserver-76f77b778f-tt2tp\" (UID: \"ca95cb69-0209-4b77-8d05-608c83cdddc2\") " pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.855831 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.872177 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nzhd2\" (UniqueName: \"kubernetes.io/projected/21423ada-b4f2-49f7-9cb7-edf1025fe79e-kube-api-access-nzhd2\") pod \"cluster-image-registry-operator-dc59b4c8b-lfhhx\" (UID: \"21423ada-b4f2-49f7-9cb7-edf1025fe79e\") " pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.875441 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.882259 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.885753 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqxxp\" (UniqueName: \"kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp\") pod \"controller-manager-879f6c89f-6246k\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.886462 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.911046 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-729cq\" (UniqueName: \"kubernetes.io/projected/7f6980a4-d26b-4132-8da4-650ed74e8a55-kube-api-access-729cq\") pod \"openshift-apiserver-operator-796bbdcf4f-lqsm5\" (UID: \"7f6980a4-d26b-4132-8da4-650ed74e8a55\") " pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.921253 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dmd6m\" (UniqueName: \"kubernetes.io/projected/e460d967-199b-41b2-a198-3acaaa1f4382-kube-api-access-dmd6m\") pod \"control-plane-machine-set-operator-78cbb6b69f-qlf2j\" (UID: \"e460d967-199b-41b2-a198-3acaaa1f4382\") " pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.938474 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7pds\" (UniqueName: \"kubernetes.io/projected/13d46677-e2af-4751-b6cf-346aca6a8e46-kube-api-access-l7pds\") pod \"package-server-manager-789f6589d5-d4bjq\" (UID: \"13d46677-e2af-4751-b6cf-346aca6a8e46\") " 
pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.952775 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.962762 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/6cfacc7d-5b16-47f9-8650-50a0810479ab-kube-api-access\") pod \"openshift-kube-scheduler-operator-5fdd9b5758-n6qxf\" (UID: \"6cfacc7d-5b16-47f9-8650-50a0810479ab\") " pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.977470 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tvt6l\" (UniqueName: \"kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l\") pod \"route-controller-manager-6576b87f9c-wb7l5\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.981461 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.996610 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" Jan 31 04:28:52 crc kubenswrapper[4812]: I0131 04:28:52.996655 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.014157 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.014453 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/f9058f87-b187-44c0-b302-712072520e59-kube-api-access\") pod \"kube-controller-manager-operator-78b949d7b-vc7bf\" (UID: \"f9058f87-b187-44c0-b302-712072520e59\") " pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.017565 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2wlzt\" (UniqueName: \"kubernetes.io/projected/df192deb-9b42-48e1-86d7-b85b217d6c1e-kube-api-access-2wlzt\") pod \"console-operator-58897d9998-b2n85\" (UID: \"df192deb-9b42-48e1-86d7-b85b217d6c1e\") " pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.031637 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.038641 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.041007 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fbfnq\" (UniqueName: \"kubernetes.io/projected/9b7264a5-4a23-4af6-b990-c07f4da8d8c5-kube-api-access-fbfnq\") pod \"multus-admission-controller-857f4d67dd-2mv2d\" (UID: \"9b7264a5-4a23-4af6-b990-c07f4da8d8c5\") " pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.058237 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.060408 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bdfxq\" (UniqueName: \"kubernetes.io/projected/4526610c-79f9-4cbc-b67f-9e2c1149f7b2-kube-api-access-bdfxq\") pod \"etcd-operator-b45778765-42rlr\" (UID: \"4526610c-79f9-4cbc-b67f-9e2c1149f7b2\") " pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.062090 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" event={"ID":"dea03f3c-7f0b-4026-82e7-f3bc79397a29","Type":"ContainerStarted","Data":"10bf0520e65c46a32079c138f9d760816a28cfaefff9da203a9ab07b25ffdede"} Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.080065 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lcn8f\" (UniqueName: \"kubernetes.io/projected/9e3da597-67fd-4e7c-8e35-4ef12610beef-kube-api-access-lcn8f\") pod \"olm-operator-6b444d44fb-lhxxt\" (UID: \"9e3da597-67fd-4e7c-8e35-4ef12610beef\") " pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.082429 4812 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.094679 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.096098 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j57qc\" (UniqueName: \"kubernetes.io/projected/4b5bdd43-4fae-4142-8905-f30435ac9180-kube-api-access-j57qc\") pod \"catalog-operator-68c6474976-j2ppp\" (UID: \"4b5bdd43-4fae-4142-8905-f30435ac9180\") " pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.107631 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.108176 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.130124 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/machine-api-operator-5694c8668f-rm7wz"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.167964 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.169528 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns-operator/dns-operator-744455d44c-fxvxq"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184138 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/094a78ab-06ca-4e51-aae3-577a1ee80df5-proxy-tls\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184202 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-mountpoint-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184239 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv7fr\" (UniqueName: \"kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184258 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " 
pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184274 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184287 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184311 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-dir\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184326 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-registration-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184342 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: 
\"kubernetes.io/configmap/e41729f9-a81f-47cb-895c-c7500855a522-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184360 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-webhook-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184375 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c15692-b01e-415b-87e3-80184b4551f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184388 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-serving-cert\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184404 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184424 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n89w7\" (UniqueName: \"kubernetes.io/projected/62e8a091-0ee4-4526-837c-d80a77d2c233-kube-api-access-n89w7\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184438 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184454 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/954336e2-74fe-443b-9bef-1247a8935c13-service-ca-bundle\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184470 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-policies\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184485 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" 
(UniqueName: \"kubernetes.io/secret/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-metrics-tls\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184531 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184564 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-metrics-certs\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184580 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpt9z\" (UniqueName: \"kubernetes.io/projected/f3ab29b7-5bdf-4727-999f-8a0a9f104374-kube-api-access-bpt9z\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184596 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e8a091-0ee4-4526-837c-d80a77d2c233-config\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc 
kubenswrapper[4812]: I0131 04:28:53.184611 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184627 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184670 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktrvs\" (UniqueName: \"kubernetes.io/projected/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-kube-api-access-ktrvs\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184700 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-images\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184716 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: 
\"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-default-certificate\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184738 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9pnx\" (UniqueName: \"kubernetes.io/projected/b6b2d935-b2ef-4444-9c39-2c91695b9765-kube-api-access-g9pnx\") pod \"downloads-7954f5f757-plj5m\" (UID: \"b6b2d935-b2ef-4444-9c39-2c91695b9765\") " pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184753 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e41729f9-a81f-47cb-895c-c7500855a522-proxy-tls\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184767 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pr8sw\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-kube-api-access-pr8sw\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184783 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5wbpj\" (UniqueName: \"kubernetes.io/projected/094a78ab-06ca-4e51-aae3-577a1ee80df5-kube-api-access-5wbpj\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " 
pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184797 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-socket-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184822 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184850 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c15692-b01e-415b-87e3-80184b4551f6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184875 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tghnd\" (UniqueName: \"kubernetes.io/projected/68cca81f-4d4d-4a38-8a6e-ba856a013888-kube-api-access-tghnd\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184889 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-trusted-ca\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184914 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-cabundle\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184934 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184951 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9gkqj\" (UniqueName: \"kubernetes.io/projected/b7936c5a-60b1-4747-bdee-8de1c2952aa7-kube-api-access-9gkqj\") pod \"migrator-59844c95c7-zwdl9\" (UID: \"b7936c5a-60b1-4747-bdee-8de1c2952aa7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184968 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: 
\"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.184982 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wljms\" (UniqueName: \"kubernetes.io/projected/88240dfe-5946-4e69-99e3-2a429675a53f-kube-api-access-wljms\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185031 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-key\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185055 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185070 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-stats-auth\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185085 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" 
(UniqueName: \"kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185130 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e8a091-0ee4-4526-837c-d80a77d2c233-serving-cert\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185146 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-encryption-config\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185162 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-plugins-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185176 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-csi-data-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185222 
4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-client\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185248 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gf48\" (UniqueName: \"kubernetes.io/projected/e41729f9-a81f-47cb-895c-c7500855a522-kube-api-access-7gf48\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185293 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/68cca81f-4d4d-4a38-8a6e-ba856a013888-tmpfs\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185309 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4g9lm\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185325 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume\") pod \"collect-profiles-29497215-pf4tb\" 
(UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185341 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-484kx\" (UniqueName: \"kubernetes.io/projected/954336e2-74fe-443b-9bef-1247a8935c13-kube-api-access-484kx\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185365 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hdk7w\" (UniqueName: \"kubernetes.io/projected/98c15692-b01e-415b-87e3-80184b4551f6-kube-api-access-hdk7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185382 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-apiservice-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.185398 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-br6rt\" (UniqueName: \"kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc 
kubenswrapper[4812]: I0131 04:28:53.185413 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.186769 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:53.686751118 +0000 UTC m=+142.181772783 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.192708 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-f9d7485db-4f2hw"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.205119 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.223657 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.284884 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.286268 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.286531 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:53.786506576 +0000 UTC m=+142.281528241 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.286626 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/094a78ab-06ca-4e51-aae3-577a1ee80df5-proxy-tls\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.286650 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mountpoint-dir\" (UniqueName: 
\"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-mountpoint-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.286677 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv7fr\" (UniqueName: \"kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287014 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287040 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287060 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287066 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"mountpoint-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-mountpoint-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287080 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-registration-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287110 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-dir\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287129 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e41729f9-a81f-47cb-895c-c7500855a522-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287146 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-webhook-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 
04:28:53.287162 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c15692-b01e-415b-87e3-80184b4551f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287180 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-certs\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287212 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-serving-cert\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287243 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287263 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287301 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n89w7\" (UniqueName: \"kubernetes.io/projected/62e8a091-0ee4-4526-837c-d80a77d2c233-kube-api-access-n89w7\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287318 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/954336e2-74fe-443b-9bef-1247a8935c13-service-ca-bundle\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287346 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4rzsg\" (UniqueName: \"kubernetes.io/projected/2aaf45c7-302a-433d-b9dc-c22d6a978311-kube-api-access-4rzsg\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287370 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-policies\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287386 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-metrics-tls\") pod 
\"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287400 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf45c7-302a-433d-b9dc-c22d6a978311-config-volume\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287421 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287475 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpt9z\" (UniqueName: \"kubernetes.io/projected/f3ab29b7-5bdf-4727-999f-8a0a9f104374-kube-api-access-bpt9z\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287492 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-metrics-certs\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287526 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/62e8a091-0ee4-4526-837c-d80a77d2c233-config\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287551 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287567 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aaf45c7-302a-433d-b9dc-c22d6a978311-metrics-tls\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287603 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287618 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-node-bootstrap-token\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287637 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hnsfn\" (UniqueName: \"kubernetes.io/projected/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-kube-api-access-hnsfn\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287665 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ktrvs\" (UniqueName: \"kubernetes.io/projected/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-kube-api-access-ktrvs\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287694 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"images\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-images\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287708 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-default-certificate\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287734 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9pnx\" (UniqueName: \"kubernetes.io/projected/b6b2d935-b2ef-4444-9c39-2c91695b9765-kube-api-access-g9pnx\") pod \"downloads-7954f5f757-plj5m\" (UID: \"b6b2d935-b2ef-4444-9c39-2c91695b9765\") " pod="openshift-console/downloads-7954f5f757-plj5m" 
Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287750 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e41729f9-a81f-47cb-895c-c7500855a522-proxy-tls\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287765 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pr8sw\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-kube-api-access-pr8sw\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287781 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5wbpj\" (UniqueName: \"kubernetes.io/projected/094a78ab-06ca-4e51-aae3-577a1ee80df5-kube-api-access-5wbpj\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.287810 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-socket-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288324 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c15692-b01e-415b-87e3-80184b4551f6-serving-cert\") pod 
\"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288359 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288407 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tghnd\" (UniqueName: \"kubernetes.io/projected/68cca81f-4d4d-4a38-8a6e-ba856a013888-kube-api-access-tghnd\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288426 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-trusted-ca\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288451 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-cabundle\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288470 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288531 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9gkqj\" (UniqueName: \"kubernetes.io/projected/b7936c5a-60b1-4747-bdee-8de1c2952aa7-kube-api-access-9gkqj\") pod \"migrator-59844c95c7-zwdl9\" (UID: \"b7936c5a-60b1-4747-bdee-8de1c2952aa7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.288547 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wljms\" (UniqueName: \"kubernetes.io/projected/88240dfe-5946-4e69-99e3-2a429675a53f-kube-api-access-wljms\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289108 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289127 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " 
pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289143 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-stats-auth\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289159 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-key\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289208 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289227 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xszmb\" (UniqueName: \"kubernetes.io/projected/4f6c8609-8788-4693-9d93-babd560187f1-kube-api-access-xszmb\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289261 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/62e8a091-0ee4-4526-837c-d80a77d2c233-serving-cert\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: 
\"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289296 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-encryption-config\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.290866 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/98c15692-b01e-415b-87e3-80184b4551f6-config\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.289365 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-plugins-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291028 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"mcc-auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/e41729f9-a81f-47cb-895c-c7500855a522-mcc-auth-proxy-config\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291120 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"csi-data-dir\" (UniqueName: 
\"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-csi-data-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291130 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-plugins-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291151 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-client\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291200 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7gf48\" (UniqueName: \"kubernetes.io/projected/e41729f9-a81f-47cb-895c-c7500855a522-kube-api-access-7gf48\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291272 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4g9lm\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291302 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/68cca81f-4d4d-4a38-8a6e-ba856a013888-tmpfs\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291344 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291349 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-trusted-ca-bundle\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291365 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6c8609-8788-4693-9d93-babd560187f1-cert\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291414 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-484kx\" (UniqueName: \"kubernetes.io/projected/954336e2-74fe-443b-9bef-1247a8935c13-kube-api-access-484kx\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.291909 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"auth-proxy-config\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-auth-proxy-config\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.293070 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hdk7w\" (UniqueName: \"kubernetes.io/projected/98c15692-b01e-415b-87e3-80184b4551f6-kube-api-access-hdk7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.293116 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-br6rt\" (UniqueName: \"kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.293140 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-apiservice-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.293193 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token\") pod \"image-registry-697d97f7c8-5c747\" (UID: 
\"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.294495 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-socket-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.296212 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:53.79619784 +0000 UTC m=+142.291219505 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.296442 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-dir\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.296286 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/62e8a091-0ee4-4526-837c-d80a77d2c233-config\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: 
\"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.297519 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/094a78ab-06ca-4e51-aae3-577a1ee80df5-proxy-tls\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.297996 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/954336e2-74fe-443b-9bef-1247a8935c13-service-ca-bundle\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.299008 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-serving-ca\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-serving-ca\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.299648 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etcd-client\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-etcd-client\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.300067 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca\") pod \"image-registry-697d97f7c8-5c747\" (UID: 
\"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.300551 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.301969 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"images\" (UniqueName: \"kubernetes.io/configmap/094a78ab-06ca-4e51-aae3-577a1ee80df5-images\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.302514 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-webhook-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.302721 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.303244 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"tmpfs\" (UniqueName: \"kubernetes.io/empty-dir/68cca81f-4d4d-4a38-8a6e-ba856a013888-tmpfs\") pod 
\"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.303267 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-cabundle\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.304156 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-registration-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.304602 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"csi-data-dir\" (UniqueName: \"kubernetes.io/host-path/88240dfe-5946-4e69-99e3-2a429675a53f-csi-data-dir\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.305210 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/f3ab29b7-5bdf-4727-999f-8a0a9f104374-audit-policies\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.305651 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume\") pod \"collect-profiles-29497215-pf4tb\" 
(UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.307065 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.307765 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-metrics-tls\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.308072 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/98c15692-b01e-415b-87e3-80184b4551f6-serving-cert\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.309125 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-trusted-ca\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.309247 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/62e8a091-0ee4-4526-837c-d80a77d2c233-serving-cert\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.316341 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-tls\" (UniqueName: \"kubernetes.io/secret/e41729f9-a81f-47cb-895c-c7500855a522-proxy-tls\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.316804 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-signing-key\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.318507 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.318572 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.320918 4812 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-default-certificate\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.321719 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"encryption-config\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-encryption-config\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322001 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume\") pod \"collect-profiles-29497215-pf4tb\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322258 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/68cca81f-4d4d-4a38-8a6e-ba856a013888-apiservice-cert\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322505 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322518 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-metrics-certs\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322560 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322689 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/954336e2-74fe-443b-9bef-1247a8935c13-stats-auth\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.322767 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f3ab29b7-5bdf-4727-999f-8a0a9f104374-serving-cert\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.338832 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv7fr\" (UniqueName: \"kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr\") pod \"collect-profiles-29497215-pf4tb\" (UID: 
\"1d5723db-1696-4fe1-a736-756e9bf39115\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.339093 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.355110 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-484kx\" (UniqueName: \"kubernetes.io/projected/954336e2-74fe-443b-9bef-1247a8935c13-kube-api-access-484kx\") pod \"router-default-5444994796-2hcvc\" (UID: \"954336e2-74fe-443b-9bef-1247a8935c13\") " pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.378167 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pr8sw\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-kube-api-access-pr8sw\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395281 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395422 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4rzsg\" (UniqueName: \"kubernetes.io/projected/2aaf45c7-302a-433d-b9dc-c22d6a978311-kube-api-access-4rzsg\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 
04:28:53.395449 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf45c7-302a-433d-b9dc-c22d6a978311-config-volume\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395483 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aaf45c7-302a-433d-b9dc-c22d6a978311-metrics-tls\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395499 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-node-bootstrap-token\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395515 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hnsfn\" (UniqueName: \"kubernetes.io/projected/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-kube-api-access-hnsfn\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395585 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xszmb\" (UniqueName: \"kubernetes.io/projected/4f6c8609-8788-4693-9d93-babd560187f1-kube-api-access-xszmb\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395629 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6c8609-8788-4693-9d93-babd560187f1-cert\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.395678 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-certs\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.397261 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5wbpj\" (UniqueName: \"kubernetes.io/projected/094a78ab-06ca-4e51-aae3-577a1ee80df5-kube-api-access-5wbpj\") pod \"machine-config-operator-74547568cd-jnqk4\" (UID: \"094a78ab-06ca-4e51-aae3-577a1ee80df5\") " pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.398380 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:53.898360704 +0000 UTC m=+142.393382369 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.399308 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/2aaf45c7-302a-433d-b9dc-c22d6a978311-config-volume\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.399924 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/2aaf45c7-302a-433d-b9dc-c22d6a978311-metrics-tls\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.400704 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"certs\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-certs\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.404117 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/4f6c8609-8788-4693-9d93-babd560187f1-cert\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.412033 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"node-bootstrap-token\" (UniqueName: \"kubernetes.io/secret/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-node-bootstrap-token\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.428296 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.430136 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpt9z\" (UniqueName: \"kubernetes.io/projected/f3ab29b7-5bdf-4727-999f-8a0a9f104374-kube-api-access-bpt9z\") pod \"apiserver-7bbb656c7d-5gx6v\" (UID: \"f3ab29b7-5bdf-4727-999f-8a0a9f104374\") " pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.445675 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktrvs\" (UniqueName: \"kubernetes.io/projected/055e189f-0506-4504-8eb0-6bfe3e9ec9e1-kube-api-access-ktrvs\") pod \"service-ca-9c57cc56f-7qlmj\" (UID: \"055e189f-0506-4504-8eb0-6bfe3e9ec9e1\") " pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.463191 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4g9lm\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.486528 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gf48\" (UniqueName: 
\"kubernetes.io/projected/e41729f9-a81f-47cb-895c-c7500855a522-kube-api-access-7gf48\") pod \"machine-config-controller-84d6567774-bvhp8\" (UID: \"e41729f9-a81f-47cb-895c-c7500855a522\") " pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.496817 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.497223 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:53.997206158 +0000 UTC m=+142.492227823 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.499307 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9pnx\" (UniqueName: \"kubernetes.io/projected/b6b2d935-b2ef-4444-9c39-2c91695b9765-kube-api-access-g9pnx\") pod \"downloads-7954f5f757-plj5m\" (UID: \"b6b2d935-b2ef-4444-9c39-2c91695b9765\") " pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.513236 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.519164 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/0264ec2c-4f75-4ef5-9e27-f4f706275a0f-bound-sa-token\") pod \"ingress-operator-5b745b69d9-qv56p\" (UID: \"0264ec2c-4f75-4ef5-9e27-f4f706275a0f\") " pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.536576 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-br6rt\" (UniqueName: \"kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt\") pod \"marketplace-operator-79b997595-wh59s\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.536606 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.556225 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wljms\" (UniqueName: \"kubernetes.io/projected/88240dfe-5946-4e69-99e3-2a429675a53f-kube-api-access-wljms\") pod \"csi-hostpathplugin-lwmmp\" (UID: \"88240dfe-5946-4e69-99e3-2a429675a53f\") " pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.591647 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hdk7w\" (UniqueName: \"kubernetes.io/projected/98c15692-b01e-415b-87e3-80184b4551f6-kube-api-access-hdk7w\") pod \"kube-storage-version-migrator-operator-b67b599dd-8mhvg\" (UID: \"98c15692-b01e-415b-87e3-80184b4551f6\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.598200 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.598591 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.09857479 +0000 UTC m=+142.593596455 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.601682 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.606241 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.614539 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.622184 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9gkqj\" (UniqueName: \"kubernetes.io/projected/b7936c5a-60b1-4747-bdee-8de1c2952aa7-kube-api-access-9gkqj\") pod \"migrator-59844c95c7-zwdl9\" (UID: \"b7936c5a-60b1-4747-bdee-8de1c2952aa7\") " pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.640533 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tghnd\" (UniqueName: \"kubernetes.io/projected/68cca81f-4d4d-4a38-8a6e-ba856a013888-kube-api-access-tghnd\") pod \"packageserver-d55dfcdfc-bpzbl\" (UID: \"68cca81f-4d4d-4a38-8a6e-ba856a013888\") " pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.646958 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.658338 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n89w7\" (UniqueName: \"kubernetes.io/projected/62e8a091-0ee4-4526-837c-d80a77d2c233-kube-api-access-n89w7\") pod \"service-ca-operator-777779d784-sk6pl\" (UID: \"62e8a091-0ee4-4526-837c-d80a77d2c233\") " pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.664300 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.672639 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.688376 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.696700 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.703003 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.703366 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.203350115 +0000 UTC m=+142.698371780 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.712089 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.705594 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hnsfn\" (UniqueName: \"kubernetes.io/projected/aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42-kube-api-access-hnsfn\") pod \"machine-config-server-qml6c\" (UID: \"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42\") " pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.721275 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4rzsg\" (UniqueName: \"kubernetes.io/projected/2aaf45c7-302a-433d-b9dc-c22d6a978311-kube-api-access-4rzsg\") pod \"dns-default-8fkq9\" (UID: \"2aaf45c7-302a-433d-b9dc-c22d6a978311\") " pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.727023 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.737389 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xszmb\" (UniqueName: \"kubernetes.io/projected/4f6c8609-8788-4693-9d93-babd560187f1-kube-api-access-xszmb\") pod \"ingress-canary-gxcsb\" (UID: \"4f6c8609-8788-4693-9d93-babd560187f1\") " pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.742117 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.751395 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.758081 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.763172 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj"] Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.767345 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/machine-config-server-qml6c" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.780431 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gxcsb" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.795679 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.805174 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.805324 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.305304033 +0000 UTC m=+142.800325698 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.805360 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.805763 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.305753805 +0000 UTC m=+142.800775470 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.906761 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.907650 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.407629111 +0000 UTC m=+142.902650776 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: I0131 04:28:53.907780 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:53 crc kubenswrapper[4812]: E0131 04:28:53.908038 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.408031752 +0000 UTC m=+142.903053417 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:53 crc kubenswrapper[4812]: W0131 04:28:53.917670 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda2f275c0_4422_45c9_8d3b_c022a4322df5.slice/crio-88483d574a98ddab1c47ac31622f0f41110cf2ca3a6671cdd0bbdf7e2bfdea6a WatchSource:0}: Error finding container 88483d574a98ddab1c47ac31622f0f41110cf2ca3a6671cdd0bbdf7e2bfdea6a: Status 404 returned error can't find the container with id 88483d574a98ddab1c47ac31622f0f41110cf2ca3a6671cdd0bbdf7e2bfdea6a Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.008686 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.011741 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.510335559 +0000 UTC m=+143.005357224 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.113019 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.113466 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.613454329 +0000 UTC m=+143.108475994 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.114469 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" event={"ID":"dea03f3c-7f0b-4026-82e7-f3bc79397a29","Type":"ContainerStarted","Data":"45e5444c5f7ccbe4852a6f64fdbabfeab6a29566850325b6c811eda2f6476c9e"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.114511 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" event={"ID":"dea03f3c-7f0b-4026-82e7-f3bc79397a29","Type":"ContainerStarted","Data":"3dbc1016521e5891fa151893b752e8158dc8d8edfee40a47a26955944c8c8746"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.117045 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" event={"ID":"43e00fbb-667f-43f7-b399-31cfcea2ba2f","Type":"ContainerStarted","Data":"8a769b3e0d6821dc3d4049f35efacd53b2cd43361267f27a50bf23aabff007ed"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.117095 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" event={"ID":"43e00fbb-667f-43f7-b399-31cfcea2ba2f","Type":"ContainerStarted","Data":"c853f6fac708d8b4f91a7501805c9b992a74030388b386cc79fb52136fd95006"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.127316 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" event={"ID":"a2f275c0-4422-45c9-8d3b-c022a4322df5","Type":"ContainerStarted","Data":"88483d574a98ddab1c47ac31622f0f41110cf2ca3a6671cdd0bbdf7e2bfdea6a"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.136178 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" event={"ID":"e5c2893c-e678-4c5e-8692-8d50c2510ded","Type":"ContainerStarted","Data":"fdbc6aa365a2597a2e940501de231a0ff024a2ff3dd433e0e8824fe654533e85"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.136250 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" event={"ID":"e5c2893c-e678-4c5e-8692-8d50c2510ded","Type":"ContainerStarted","Data":"35f681c4d848283fefb6590d3699b4f73ad8e22ec031dd8616c9f1be14fd3579"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.136261 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" event={"ID":"e5c2893c-e678-4c5e-8692-8d50c2510ded","Type":"ContainerStarted","Data":"5eb4a60551076c17cb3626630b2a3a31248841cb326bb741389927d31ad6ae79"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.137692 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qml6c" event={"ID":"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42","Type":"ContainerStarted","Data":"7572cc91a8d73d5b2230e9c3df84aaf5ee5539ca9bb6af64ed838a525fbfd7c5"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.139234 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4f2hw" event={"ID":"ccd8432b-f254-450e-9b70-e0e89ead504d","Type":"ContainerStarted","Data":"9c5c3c2b33a89193fd18c0ce06742f218bc594e8ba47809ac69fad4415cd0eb8"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.139283 4812 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-console/console-f9d7485db-4f2hw" event={"ID":"ccd8432b-f254-450e-9b70-e0e89ead504d","Type":"ContainerStarted","Data":"d2e18aa2c9fb9d9310c85db47ca4474f21670ed3f93f48bf4a9ae2a14828b498"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.145740 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" event={"ID":"ca2e5ff5-1897-488c-9823-706462fbc903","Type":"ContainerStarted","Data":"436959183a1abbe56993a41a72d66e1f94a456f46ee74f4de7bc126f8c19d54d"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.145789 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" event={"ID":"ca2e5ff5-1897-488c-9823-706462fbc903","Type":"ContainerStarted","Data":"165ffa5b476cbbba574c3d72dbe1965f00f5794c22af567253abaf513f1658db"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.151627 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2hcvc" event={"ID":"954336e2-74fe-443b-9bef-1247a8935c13","Type":"ContainerStarted","Data":"fa2d8368deeb0d02c5529a6bb2ad6aafb2272b739a755c1a574f152415d2d5cb"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.151671 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-5444994796-2hcvc" event={"ID":"954336e2-74fe-443b-9bef-1247a8935c13","Type":"ContainerStarted","Data":"dd45caa817ab864468010a932f43b6ee6cbe71e5be0d71a588bcc53cc34864a4"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.155360 4812 generic.go:334] "Generic (PLEG): container finished" podID="d8458779-3738-4315-a142-4b5287a2b8fa" containerID="f51d9454daa84d2293b02f372714f67f4aeb29e48778ee01696f4accaf92ac26" exitCode=0 Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.155407 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" event={"ID":"d8458779-3738-4315-a142-4b5287a2b8fa","Type":"ContainerDied","Data":"f51d9454daa84d2293b02f372714f67f4aeb29e48778ee01696f4accaf92ac26"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.157406 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" event={"ID":"d8458779-3738-4315-a142-4b5287a2b8fa","Type":"ContainerStarted","Data":"67921a5c8f0ee15b09ab9b44a693c7452e6cc6da75bb4ae25a1530261845fabb"} Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.214405 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.215400 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.715384707 +0000 UTC m=+143.210406362 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.288693 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.309772 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication-operator/authentication-operator-69f744f599-bw8rt"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.316500 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.317760 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.817745756 +0000 UTC m=+143.312767421 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.322256 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.331328 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.333169 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq"] Jan 31 04:28:54 crc kubenswrapper[4812]: W0131 04:28:54.353238 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5058ec63_1bc0_4113_b436_041e7e1a37f5.slice/crio-3708f9e1581d697ee500ade98f36252570fa0dd5943014586b22ddabe9492a5b WatchSource:0}: Error finding container 3708f9e1581d697ee500ade98f36252570fa0dd5943014586b22ddabe9492a5b: Status 404 returned error can't find the container with id 3708f9e1581d697ee500ade98f36252570fa0dd5943014586b22ddabe9492a5b Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.354169 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-apiserver/apiserver-76f77b778f-tt2tp"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.361487 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 
04:28:54.418917 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.419317 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:54.919301623 +0000 UTC m=+143.414323288 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.421037 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.421374 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:54.921365289 +0000 UTC m=+143.416386954 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.515583 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.524177 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.524322 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:54 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:54 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:54 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.524381 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 
04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.524744 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.024725886 +0000 UTC m=+143.519747551 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.586682 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.627338 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.627704 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.127691981 +0000 UTC m=+143.622713646 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.647506 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-58897d9998-b2n85"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.686556 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.693783 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.698873 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.704249 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-etcd-operator/etcd-operator-b45778765-42rlr"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.707953 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/multus-admission-controller-857f4d67dd-2mv2d"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.710754 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.713096 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx"] Jan 31 
04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.727912 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.728300 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.228272882 +0000 UTC m=+143.723294547 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: W0131 04:28:54.782990 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode460d967_199b_41b2_a198_3acaaa1f4382.slice/crio-001e0602b6a0cf5cf20422810c79a991aa7e5538c5bbbf23facfea37ff77dcbd WatchSource:0}: Error finding container 001e0602b6a0cf5cf20422810c79a991aa7e5538c5bbbf23facfea37ff77dcbd: Status 404 returned error can't find the container with id 001e0602b6a0cf5cf20422810c79a991aa7e5538c5bbbf23facfea37ff77dcbd Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.829524 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.830720 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.330700473 +0000 UTC m=+143.825722138 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.925063 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.936047 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:54 crc kubenswrapper[4812]: E0131 04:28:54.936486 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:55.436438664 +0000 UTC m=+143.931460319 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.942861 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-8fkq9"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.950774 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.952356 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.970301 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["hostpath-provisioner/csi-hostpathplugin-lwmmp"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.970348 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8"] Jan 31 04:28:54 crc kubenswrapper[4812]: I0131 04:28:54.993127 4812 csr.go:261] certificate signing request csr-z68gb is approved, waiting to be issued Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.002145 4812 csr.go:257] certificate signing request csr-z68gb is issued Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.010028 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg"] Jan 
31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.012972 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gxcsb"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.027570 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.030982 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.031039 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.032582 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.033701 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.035171 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-7954f5f757-plj5m"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.037077 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.037405 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:55.537391765 +0000 UTC m=+144.032413430 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: W0131 04:28:55.067246 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7936c5a_60b1_4747_bdee_8de1c2952aa7.slice/crio-f0d4f89f69e2724d98f99a4c27fe47a212f8cac54e1a6129c4b9eade03c8de06 WatchSource:0}: Error finding container f0d4f89f69e2724d98f99a4c27fe47a212f8cac54e1a6129c4b9eade03c8de06: Status 404 returned error can't find the container with id f0d4f89f69e2724d98f99a4c27fe47a212f8cac54e1a6129c4b9eade03c8de06 Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.079934 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-9c57cc56f-7qlmj"] Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.139244 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.139428 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:55.639412225 +0000 UTC m=+144.134433890 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.139498 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.139785 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.639778874 +0000 UTC m=+144.134800539 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.206757 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" event={"ID":"6cfacc7d-5b16-47f9-8650-50a0810479ab","Type":"ContainerStarted","Data":"b6f7f84edf403d2e3c136b426c02fbe929e54857c8abba212b9aec865ca1ed21"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.219985 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" event={"ID":"1d5723db-1696-4fe1-a736-756e9bf39115","Type":"ContainerStarted","Data":"20b59b73d7eb79077ca13ea5a7dab1655e384b9233daa85ff400b97c65ccd8c3"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.236220 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" event={"ID":"71bda3a8-4993-48f0-abaf-300a04380ac7","Type":"ContainerStarted","Data":"4a4206acc74d2aa313b8a73c44866828cd2c524693f349440e8a298dac61b225"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.236254 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" event={"ID":"71bda3a8-4993-48f0-abaf-300a04380ac7","Type":"ContainerStarted","Data":"684baeead2aad15b37ed7c9d09a0a151caf98eb8d5f72c39369cd17c08f8ce9a"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.237466 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.240378 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.240716 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.740702934 +0000 UTC m=+144.235724599 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.241967 4812 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wb7l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.242017 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" 
probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.246773 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" event={"ID":"68cca81f-4d4d-4a38-8a6e-ba856a013888","Type":"ContainerStarted","Data":"1ae0531a26a3f14680efebd623fccfd61ca0282e1599c1799aac0018a7663cdc"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.258627 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-5444994796-2hcvc" podStartSLOduration=120.258611613 podStartE2EDuration="2m0.258611613s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.256792702 +0000 UTC m=+143.751814367" watchObservedRunningTime="2026-01-31 04:28:55.258611613 +0000 UTC m=+143.753633278" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.259193 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-machine-approver/machine-approver-56656f9798-bf26c" podStartSLOduration=120.259189058 podStartE2EDuration="2m0.259189058s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.218481629 +0000 UTC m=+143.713503304" watchObservedRunningTime="2026-01-31 04:28:55.259189058 +0000 UTC m=+143.754210723" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.286243 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-b2n85" event={"ID":"df192deb-9b42-48e1-86d7-b85b217d6c1e","Type":"ContainerStarted","Data":"0e1362caf8ab5661bc4d47463342c8003bb977cc1c4fb16b69fa78b02a12caff"} Jan 31 
04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.292231 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" event={"ID":"f9058f87-b187-44c0-b302-712072520e59","Type":"ContainerStarted","Data":"7c73a13136fbba1a9f7425e3683ebfb4dc70fc009d5ac7690bc3ee2931904f48"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.294017 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" event={"ID":"98c15692-b01e-415b-87e3-80184b4551f6","Type":"ContainerStarted","Data":"3c0b8aeb017c595b0a9f17960f594ef835ed6d91334feff1a79a90f6c7feb7a9"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.303865 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" event={"ID":"4526610c-79f9-4cbc-b67f-9e2c1149f7b2","Type":"ContainerStarted","Data":"9c166945d5cf1581eaba020f4229f66f364cc5a15860e8bfdee7f8234b361e79"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.316172 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-server-qml6c" event={"ID":"aec10d96-6bc3-4bd3-a7ad-5307c8f0ec42","Type":"ContainerStarted","Data":"996f3b98d2ef783897be26835fd4a18850666dd1113f4137deced630256e4238"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.334829 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" event={"ID":"9b7264a5-4a23-4af6-b990-c07f4da8d8c5","Type":"ContainerStarted","Data":"d1f921f9e58cfd44635797292f3cef2478bde5f1ca34bbfc4d9d2caa8f08bf5a"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.341880 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.342760 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.842748415 +0000 UTC m=+144.337770080 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.345209 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" event={"ID":"9e3da597-67fd-4e7c-8e35-4ef12610beef","Type":"ContainerStarted","Data":"808eb31ac592aaf9c5c4b08ea62f103bcc9daf13dbe91e0599fb44d4769da393"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.346456 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" event={"ID":"a2f275c0-4422-45c9-8d3b-c022a4322df5","Type":"ContainerStarted","Data":"1e1bdee7094f3342d56087307936e835ba065964f7d358e703e536d44ffc104d"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.376914 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8fkq9" 
event={"ID":"2aaf45c7-302a-433d-b9dc-c22d6a978311","Type":"ContainerStarted","Data":"0b462fc9869a1c800f6ab48ff6d56123cc64e74a2b235b61a682ee4313ceff5e"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.393427 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/machine-api-operator-5694c8668f-rm7wz" podStartSLOduration=120.393410425 podStartE2EDuration="2m0.393410425s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.335141737 +0000 UTC m=+143.830163402" watchObservedRunningTime="2026-01-31 04:28:55.393410425 +0000 UTC m=+143.888432090" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.394497 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver-operator/kube-apiserver-operator-766d6c64bb-2twh6" podStartSLOduration=120.394493315 podStartE2EDuration="2m0.394493315s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.393068656 +0000 UTC m=+143.888090321" watchObservedRunningTime="2026-01-31 04:28:55.394493315 +0000 UTC m=+143.889514980" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.395618 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" event={"ID":"41f9cb0c-1919-4647-bc7f-dfc345c0b6be","Type":"ContainerStarted","Data":"faa778a9fa580bd4e1d224453ce446bdcad1eda1ed9e44beed6c74d5915bda24"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.395645 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" 
event={"ID":"41f9cb0c-1919-4647-bc7f-dfc345c0b6be","Type":"ContainerStarted","Data":"7a6af771c696e6beaaba3d7ee15efb52153673b059d6daaf485b1a55cfd5159a"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.395654 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" event={"ID":"41f9cb0c-1919-4647-bc7f-dfc345c0b6be","Type":"ContainerStarted","Data":"657d1c218304ee0af3e4cb304d06281b4f3e5de5376bf87452034b8eebd92ce8"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.414093 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" event={"ID":"e460d967-199b-41b2-a198-3acaaa1f4382","Type":"ContainerStarted","Data":"ff7a79263a4c86268f5422c7a40a6df31d4108409d009da9657fbbdddded974f"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.414153 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" event={"ID":"e460d967-199b-41b2-a198-3acaaa1f4382","Type":"ContainerStarted","Data":"001e0602b6a0cf5cf20422810c79a991aa7e5538c5bbbf23facfea37ff77dcbd"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.431080 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" event={"ID":"0264ec2c-4f75-4ef5-9e27-f4f706275a0f","Type":"ContainerStarted","Data":"9ce86c85fb483cb262ab52ca9c2a31dc6fe2f9c4c895b20bbee57c3333ead9e9"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.442409 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 
04:28:55.443330 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:55.943315495 +0000 UTC m=+144.438337160 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.448785 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" event={"ID":"43e00fbb-667f-43f7-b399-31cfcea2ba2f","Type":"ContainerStarted","Data":"ac469ffed544203f2080b5152ac502b8731b614e781da521d1cf2fce667b2302"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.475391 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-f9d7485db-4f2hw" podStartSLOduration=120.475360819 podStartE2EDuration="2m0.475360819s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.427155185 +0000 UTC m=+143.922176850" watchObservedRunningTime="2026-01-31 04:28:55.475360819 +0000 UTC m=+143.970382494" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.488853 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" 
event={"ID":"5058ec63-1bc0-4113-b436-041e7e1a37f5","Type":"ContainerStarted","Data":"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.488894 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" event={"ID":"5058ec63-1bc0-4113-b436-041e7e1a37f5","Type":"ContainerStarted","Data":"3708f9e1581d697ee500ade98f36252570fa0dd5943014586b22ddabe9492a5b"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.489613 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.490791 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" event={"ID":"62e8a091-0ee4-4526-837c-d80a77d2c233","Type":"ContainerStarted","Data":"29a066c2c1cbbb0b64dd42b55cf4e286caabdcc23bdab5490a3e839e7dbfe368"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.492669 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" event={"ID":"d8458779-3738-4315-a142-4b5287a2b8fa","Type":"ContainerStarted","Data":"f1146835a25e79279896c0a342b90299373c471dd06db153fcf00442716c12ce"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.493224 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.493928 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" event={"ID":"39f52f71-fcee-4193-95db-158c8fe2f71f","Type":"ContainerStarted","Data":"ee792830d50fec32b4523dd47ae2a02e20fd4c62fcd3100a8dab68384745faea"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.494977 4812 patch_prober.go:28] 
interesting pod/oauth-openshift-558db77b4-57sr9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.495029 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.495401 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" event={"ID":"21423ada-b4f2-49f7-9cb7-edf1025fe79e","Type":"ContainerStarted","Data":"4650e358da0d28b218d9893cf5d3891cd51929a91820452a42756b5c73011279"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.495957 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxcsb" event={"ID":"4f6c8609-8788-4693-9d93-babd560187f1","Type":"ContainerStarted","Data":"61eb594a92e42d951a1fabfdc53c84a6ac77b3e3fa324c5ac20d5b3e0e9999c2"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.497054 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" event={"ID":"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8","Type":"ContainerStarted","Data":"d91b644d5059017025c646fda8fbbaa5254878aa92fa33781b703ee6d4ed26e1"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.497076 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" 
event={"ID":"c4261f6c-8fb6-4c68-9fab-5c2f46afcca8","Type":"ContainerStarted","Data":"dc17980575712787c46a32f7f3adc994a0fda782b11cd76b4ec17c51aaeb08bb"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.498058 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" event={"ID":"e41729f9-a81f-47cb-895c-c7500855a522","Type":"ContainerStarted","Data":"911bb7cdcfc6b7cba0ec086347970dfa14c05da9a6dcde5a604381600816a7d9"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.498948 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" event={"ID":"094a78ab-06ca-4e51-aae3-577a1ee80df5","Type":"ContainerStarted","Data":"13233908a7d959fff2fe8ea505763b636cc64dd5de24cd9139c70a2b1cbb448b"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.499818 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" event={"ID":"7f6980a4-d26b-4132-8da4-650ed74e8a55","Type":"ContainerStarted","Data":"feba0409a0a0994a5eb3b78765c41c3313c0bb9b81c971bef55be6a009bfdaf6"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.499859 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" event={"ID":"7f6980a4-d26b-4132-8da4-650ed74e8a55","Type":"ContainerStarted","Data":"d76618cdce0930b294b532bd5a661cb4421109e48f4453873ed7b221968a4ec5"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.508772 4812 generic.go:334] "Generic (PLEG): container finished" podID="ca95cb69-0209-4b77-8d05-608c83cdddc2" containerID="9834639c277490e94728b4380d8fbce81667b6a00a3bbc29b5d4b590d173933c" exitCode=0 Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.508843 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" 
event={"ID":"ca95cb69-0209-4b77-8d05-608c83cdddc2","Type":"ContainerDied","Data":"9834639c277490e94728b4380d8fbce81667b6a00a3bbc29b5d4b590d173933c"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.508859 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" event={"ID":"ca95cb69-0209-4b77-8d05-608c83cdddc2","Type":"ContainerStarted","Data":"f67a0ca5f5dce2a9bf104f77fd1cb4aacbbdf8cf58714ff577d23571db86675f"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.513484 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" event={"ID":"13d46677-e2af-4751-b6cf-346aca6a8e46","Type":"ContainerStarted","Data":"7e80c9d8155b89c4cc31077eb33effe1b4a3a17745f071f9e56dd01a01d93a84"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.513510 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" event={"ID":"13d46677-e2af-4751-b6cf-346aca6a8e46","Type":"ContainerStarted","Data":"42b6fc6e10d2d125ac42709adf01fc3e7dcb77b527ec6d7ee881a23cf2984758"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.516549 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:55 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:55 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:55 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.516591 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with 
statuscode: 500" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.518615 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager-operator/openshift-controller-manager-operator-756b6f6bc6-59zlj" podStartSLOduration=120.518593226 podStartE2EDuration="2m0.518593226s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.476421237 +0000 UTC m=+143.971442902" watchObservedRunningTime="2026-01-31 04:28:55.518593226 +0000 UTC m=+144.013614901" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.520269 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-server-qml6c" podStartSLOduration=5.520262182 podStartE2EDuration="5.520262182s" podCreationTimestamp="2026-01-31 04:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.515517662 +0000 UTC m=+144.010539327" watchObservedRunningTime="2026-01-31 04:28:55.520262182 +0000 UTC m=+144.015283847" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.536091 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" event={"ID":"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc","Type":"ContainerStarted","Data":"098748da72a159c8b96bdbc0c778d5a309cbf23b968142e7da6edac15584c5ea"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.536136 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" event={"ID":"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc","Type":"ContainerStarted","Data":"dc3fdfaea7df7c57524a86a94e6b6db75fca34a88b78488a9db9592d2e6f8e7d"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.537043 4812 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.542171 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" event={"ID":"b7936c5a-60b1-4747-bdee-8de1c2952aa7","Type":"ContainerStarted","Data":"f0d4f89f69e2724d98f99a4c27fe47a212f8cac54e1a6129c4b9eade03c8de06"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.543238 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" event={"ID":"f3ab29b7-5bdf-4727-999f-8a0a9f104374","Type":"ContainerStarted","Data":"e3e2d2e461da5482b1565e4d1559c38a5eedfe6c3f66c6c4cd7257fcb30ff229"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.545255 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-plj5m" event={"ID":"b6b2d935-b2ef-4444-9c39-2c91695b9765","Type":"ContainerStarted","Data":"c28584f5fee7173a790d0deb6f42390fa69f62a94ee79de954916809e72bdd57"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.545574 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.546732 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.046717443 +0000 UTC m=+144.541739108 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.548832 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" event={"ID":"4b5bdd43-4fae-4142-8905-f30435ac9180","Type":"ContainerStarted","Data":"dbd9f68673caa778ade271b4c632b95f68a0f0f54a964e0daf30ccfdccfe4a4d"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.549190 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.551146 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" event={"ID":"88240dfe-5946-4e69-99e3-2a429675a53f","Type":"ContainerStarted","Data":"f54b0506d2bf77af35e17f592b7b1131a186aacb01d53931e02d94a6d67fd9a0"} Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.552241 4812 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6246k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.552273 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" probeResult="failure" output="Get 
\"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.554510 4812 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-j2ppp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.554537 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" podUID="4b5bdd43-4fae-4142-8905-f30435ac9180" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.569275 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-665b6dd947-7qdgm" podStartSLOduration=120.569263437 podStartE2EDuration="2m0.569263437s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.537049279 +0000 UTC m=+144.032070944" watchObservedRunningTime="2026-01-31 04:28:55.569263437 +0000 UTC m=+144.064285102" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.612336 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-api/control-plane-machine-set-operator-78cbb6b69f-qlf2j" podStartSLOduration=120.61232099 podStartE2EDuration="2m0.61232099s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.571496648 +0000 UTC m=+144.066518313" 
watchObservedRunningTime="2026-01-31 04:28:55.61232099 +0000 UTC m=+144.107342655" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.649281 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.650318 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.150304665 +0000 UTC m=+144.645326330 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.669197 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podStartSLOduration=120.669177239 podStartE2EDuration="2m0.669177239s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.615105146 +0000 UTC m=+144.110126811" watchObservedRunningTime="2026-01-31 04:28:55.669177239 +0000 UTC m=+144.164198904" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.686225 
4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication-operator/authentication-operator-69f744f599-bw8rt" podStartSLOduration=120.683541711 podStartE2EDuration="2m0.683541711s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.664739139 +0000 UTC m=+144.159760814" watchObservedRunningTime="2026-01-31 04:28:55.683541711 +0000 UTC m=+144.178563376" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.707778 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" podStartSLOduration=120.707763171 podStartE2EDuration="2m0.707763171s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.706194368 +0000 UTC m=+144.201216033" watchObservedRunningTime="2026-01-31 04:28:55.707763171 +0000 UTC m=+144.202784836" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.731942 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podStartSLOduration=120.731930179 podStartE2EDuration="2m0.731930179s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.730698846 +0000 UTC m=+144.225720511" watchObservedRunningTime="2026-01-31 04:28:55.731930179 +0000 UTC m=+144.226951844" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.751493 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.751769 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.25176016 +0000 UTC m=+144.746781825 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.775756 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" podStartSLOduration=120.775739303 podStartE2EDuration="2m0.775739303s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.775191948 +0000 UTC m=+144.270213613" watchObservedRunningTime="2026-01-31 04:28:55.775739303 +0000 UTC m=+144.270760968" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.853568 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod 
\"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.853848 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.35381529 +0000 UTC m=+144.848836955 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.853952 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.854383 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.354376325 +0000 UTC m=+144.849397990 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.855169 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" podStartSLOduration=120.855089945 podStartE2EDuration="2m0.855089945s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.853721388 +0000 UTC m=+144.348743073" watchObservedRunningTime="2026-01-31 04:28:55.855089945 +0000 UTC m=+144.350111610" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.896142 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver-operator/openshift-apiserver-operator-796bbdcf4f-lqsm5" podStartSLOduration=120.896123693 podStartE2EDuration="2m0.896123693s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.895758153 +0000 UTC m=+144.390779978" watchObservedRunningTime="2026-01-31 04:28:55.896123693 +0000 UTC m=+144.391145358" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.932573 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns-operator/dns-operator-744455d44c-fxvxq" podStartSLOduration=120.932555366 podStartE2EDuration="2m0.932555366s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:55.931807866 +0000 UTC m=+144.426829531" watchObservedRunningTime="2026-01-31 04:28:55.932555366 +0000 UTC m=+144.427577021" Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.954571 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.954736 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.45471457 +0000 UTC m=+144.949736235 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:55 crc kubenswrapper[4812]: I0131 04:28:55.954939 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:55 crc kubenswrapper[4812]: E0131 04:28:55.955235 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.455223323 +0000 UTC m=+144.950244988 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.004088 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Certificate expiration is 2027-01-31 04:23:54 +0000 UTC, rotation deadline is 2026-10-26 05:49:45.667302566 +0000 UTC Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.004132 4812 certificate_manager.go:356] kubernetes.io/kubelet-serving: Waiting 6433h20m49.66317322s for next certificate rotation Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.055516 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.055689 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.55566723 +0000 UTC m=+145.050688895 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.055987 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.056239 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.556231795 +0000 UTC m=+145.051253460 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.157202 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.157429 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.657395103 +0000 UTC m=+145.152416788 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.157694 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.158039 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.658026819 +0000 UTC m=+145.153048484 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.258970 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.259366 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.75935094 +0000 UTC m=+145.254372605 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.361260 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.361712 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.861685869 +0000 UTC m=+145.356707574 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.462061 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.462268 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.962230638 +0000 UTC m=+145.457252323 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.462338 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.462630 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:56.962617689 +0000 UTC m=+145.457639354 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.519225 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:56 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:56 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:56 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.519285 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.562098 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" event={"ID":"055e189f-0506-4504-8eb0-6bfe3e9ec9e1","Type":"ContainerStarted","Data":"218f7d0cacbe724745b8715e14fb053fff08303c90f669d78c69e24a8289e7df"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.563044 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: 
\"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.563324 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.063280242 +0000 UTC m=+145.558301947 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.567988 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-58897d9998-b2n85" event={"ID":"df192deb-9b42-48e1-86d7-b85b217d6c1e","Type":"ContainerStarted","Data":"53cadfe313bab457e53ef50def2531fb0425525087648e347b21ebf0253e7ddd"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.569085 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.571703 4812 patch_prober.go:28] interesting pod/console-operator-58897d9998-b2n85 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.571771 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b2n85" 
podUID="df192deb-9b42-48e1-86d7-b85b217d6c1e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.576748 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" event={"ID":"9e3da597-67fd-4e7c-8e35-4ef12610beef","Type":"ContainerStarted","Data":"345855e71e03ce7020275ab4ad424c6b05f1c1640f43146d4dd9feb6fa7640f7"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.587555 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" event={"ID":"4b5bdd43-4fae-4142-8905-f30435ac9180","Type":"ContainerStarted","Data":"6d6de5a4777e26f65160d6d119766c6940644a0acdb48c01cf78a37cdf6dc638"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.588347 4812 patch_prober.go:28] interesting pod/catalog-operator-68c6474976-j2ppp container/catalog-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" start-of-body= Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.588405 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" podUID="4b5bdd43-4fae-4142-8905-f30435ac9180" containerName="catalog-operator" probeResult="failure" output="Get \"https://10.217.0.32:8443/healthz\": dial tcp 10.217.0.32:8443: connect: connection refused" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.613242 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" 
event={"ID":"21423ada-b4f2-49f7-9cb7-edf1025fe79e","Type":"ContainerStarted","Data":"7bad1271f91c211d3a5011b05f0ca82f7cdec55c528b4ddd66844f87c0ab4bb9"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.626598 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" event={"ID":"13d46677-e2af-4751-b6cf-346aca6a8e46","Type":"ContainerStarted","Data":"43d3ce1b0f89166f540a76de194054bc7d44e177feefafcb1ea77f3649cb9bce"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.627469 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.646489 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" event={"ID":"4526610c-79f9-4cbc-b67f-9e2c1149f7b2","Type":"ContainerStarted","Data":"e404772b9536f8f978f223be44dbcd022703ac5d6df44641a10c7a61f75de4c8"} Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.646559 4812 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wb7l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.646778 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.654828 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-console-operator/console-operator-58897d9998-b2n85" podStartSLOduration=121.654807556 podStartE2EDuration="2m1.654807556s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:56.593484345 +0000 UTC m=+145.088506030" watchObservedRunningTime="2026-01-31 04:28:56.654807556 +0000 UTC m=+145.149829221" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.656114 4812 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6246k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.656169 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.661132 4812 patch_prober.go:28] interesting pod/oauth-openshift-558db77b4-57sr9 container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" start-of-body= Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.661208 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.23:6443/healthz\": dial tcp 10.217.0.23:6443: connect: connection refused" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 
04:28:56.665184 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" podStartSLOduration=121.665159087 podStartE2EDuration="2m1.665159087s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:56.655502194 +0000 UTC m=+145.150523869" watchObservedRunningTime="2026-01-31 04:28:56.665159087 +0000 UTC m=+145.160180742" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.665283 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.665606 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.16559172 +0000 UTC m=+145.660613385 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.673884 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-etcd-operator/etcd-operator-b45778765-42rlr" podStartSLOduration=121.673870226 podStartE2EDuration="2m1.673870226s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:56.673567467 +0000 UTC m=+145.168589132" watchObservedRunningTime="2026-01-31 04:28:56.673870226 +0000 UTC m=+145.168891891" Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.767084 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.767277 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.267256679 +0000 UTC m=+145.762278344 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.768154 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.770180 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.270171489 +0000 UTC m=+145.765193154 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.869544 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.870390 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.37037154 +0000 UTC m=+145.865393205 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:56 crc kubenswrapper[4812]: I0131 04:28:56.971316 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:56 crc kubenswrapper[4812]: E0131 04:28:56.972933 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.472913263 +0000 UTC m=+145.967934978 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.073524 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.073859 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.573831763 +0000 UTC m=+146.068853428 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.174609 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.174957 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.674946978 +0000 UTC m=+146.169968633 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.275703 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.276339 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.776324161 +0000 UTC m=+146.271345826 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.377299 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.377643 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.877633581 +0000 UTC m=+146.372655236 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.478548 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.478713 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.978688034 +0000 UTC m=+146.473709699 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.479004 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.479393 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:57.979374914 +0000 UTC m=+146.474396579 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.517575 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:57 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:57 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:57 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.517631 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.580614 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.581077 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:58.081058634 +0000 UTC m=+146.576080299 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.672206 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" event={"ID":"68cca81f-4d4d-4a38-8a6e-ba856a013888","Type":"ContainerStarted","Data":"1a29e6e50b5f35786c1c0a3eadae83177455cbdd531926630f010e055ae1f69b"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.672745 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.674986 4812 patch_prober.go:28] interesting pod/packageserver-d55dfcdfc-bpzbl container/packageserver namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.675031 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" podUID="68cca81f-4d4d-4a38-8a6e-ba856a013888" containerName="packageserver" probeResult="failure" output="Get \"https://10.217.0.42:5443/healthz\": dial tcp 10.217.0.42:5443: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.675803 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" event={"ID":"62e8a091-0ee4-4526-837c-d80a77d2c233","Type":"ContainerStarted","Data":"25f10abf3d4cacdab100b322ad27c6e554046bbefcf81eeb0866b4ebb299989f"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.678171 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-7954f5f757-plj5m" event={"ID":"b6b2d935-b2ef-4444-9c39-2c91695b9765","Type":"ContainerStarted","Data":"f845954310eb30586311b918f45cb50230235c52a151cea1ca675af2f944b5c4"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.678915 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.680999 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.681044 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.681560 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.681886 4812 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.181875111 +0000 UTC m=+146.676896776 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.682991 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" event={"ID":"1d5723db-1696-4fe1-a736-756e9bf39115","Type":"ContainerStarted","Data":"5253eb5a05fb0be7a9c86b4254bcd7ed80c9c6440c7c6aed5f218e666951e399"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.686275 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" event={"ID":"ca95cb69-0209-4b77-8d05-608c83cdddc2","Type":"ContainerStarted","Data":"d27ca130b6445b49182fcf12fd9a3d511aadbc34ba5427c88f146fd5bf205756"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.686311 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" event={"ID":"ca95cb69-0209-4b77-8d05-608c83cdddc2","Type":"ContainerStarted","Data":"302c7b960356a53b546039ab336b4825100a5c5a595c96773630ea090d307bc1"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.687247 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" 
event={"ID":"9b7264a5-4a23-4af6-b990-c07f4da8d8c5","Type":"ContainerStarted","Data":"54a4e2f092d5e1dabed4fe32fddeacc04c9501cb862e388484f8ab3cbca23f98"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.687270 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" event={"ID":"9b7264a5-4a23-4af6-b990-c07f4da8d8c5","Type":"ContainerStarted","Data":"ef960a380ef0844f00cff5c5428c41925b2f8724953abf0ede3f0eaf1cb50ecb"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.689389 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" event={"ID":"39f52f71-fcee-4193-95db-158c8fe2f71f","Type":"ContainerStarted","Data":"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.690084 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.691380 4812 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wh59s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.691421 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.692271 4812 generic.go:334] "Generic (PLEG): container finished" podID="f3ab29b7-5bdf-4727-999f-8a0a9f104374" 
containerID="5db4d875f0376f843771f95818a6c617e5e238ba973623c77bbc3d6ca892459d" exitCode=0 Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.692316 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" event={"ID":"f3ab29b7-5bdf-4727-999f-8a0a9f104374","Type":"ContainerDied","Data":"5db4d875f0376f843771f95818a6c617e5e238ba973623c77bbc3d6ca892459d"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.693484 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" event={"ID":"055e189f-0506-4504-8eb0-6bfe3e9ec9e1","Type":"ContainerStarted","Data":"0367bcbcd69791d4a4e28d8a0bb94e7a13c583c8e1117eb5f531e4e96fd4bf80"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.697084 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8fkq9" event={"ID":"2aaf45c7-302a-433d-b9dc-c22d6a978311","Type":"ContainerStarted","Data":"fd966aeff8858629bae68997143e8dc67396146ae13a73149cdae34bf7707943"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.697109 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-8fkq9" event={"ID":"2aaf45c7-302a-433d-b9dc-c22d6a978311","Type":"ContainerStarted","Data":"911dd6c143a0d0f00a5cf3166510e87223cd3619f1cf95aec8e730121590c726"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.697460 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-dns/dns-default-8fkq9" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.702204 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gxcsb" event={"ID":"4f6c8609-8788-4693-9d93-babd560187f1","Type":"ContainerStarted","Data":"497df1ef4d57a37660ee94754a19500df7d030aafd525f9c42de7e52dd97d273"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.706493 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" event={"ID":"0264ec2c-4f75-4ef5-9e27-f4f706275a0f","Type":"ContainerStarted","Data":"f4619e27200bb8aba162b27c759cb23cc6f76e4a3d42f6087ebdbbc1232427f4"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.706520 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" event={"ID":"0264ec2c-4f75-4ef5-9e27-f4f706275a0f","Type":"ContainerStarted","Data":"a3f31248bd87e0983d701ce48ba1464f5f6812efedd905dcc87d7b713857c7b1"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.709666 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" event={"ID":"6cfacc7d-5b16-47f9-8650-50a0810479ab","Type":"ContainerStarted","Data":"851feb0dfa907b7674c6de0d8b7a87fcfe18e82a6ff229eb90cf38e138130717"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.710470 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" podStartSLOduration=122.71046061 podStartE2EDuration="2m2.71046061s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.707737556 +0000 UTC m=+146.202759221" watchObservedRunningTime="2026-01-31 04:28:57.71046061 +0000 UTC m=+146.205482275" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.715302 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" event={"ID":"e41729f9-a81f-47cb-895c-c7500855a522","Type":"ContainerStarted","Data":"a37f8a1fae5f06db20b6c7dbf09c63bbc3c7383b28b4ed0a8661ce74235120e9"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.715343 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" event={"ID":"e41729f9-a81f-47cb-895c-c7500855a522","Type":"ContainerStarted","Data":"b0feac61afc73dadad80d4e6d5e37ce145ccc6684d532dcb4e254b0a0da2cb1e"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.722942 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" event={"ID":"88240dfe-5946-4e69-99e3-2a429675a53f","Type":"ContainerStarted","Data":"ba4d41c6991ed1c9974eaf2f2ae97100934748634d5ca8abadd000284603e1bd"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.730914 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" event={"ID":"98c15692-b01e-415b-87e3-80184b4551f6","Type":"ContainerStarted","Data":"850d15e594d1827a2734230097790fb2ad649ab00bb776a1654ae4a09db5594c"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.737304 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" event={"ID":"094a78ab-06ca-4e51-aae3-577a1ee80df5","Type":"ContainerStarted","Data":"2ccfe198b911f91689c4868fb880f77cc3854f75de377a799395db48fc8a8b0a"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.737339 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" event={"ID":"094a78ab-06ca-4e51-aae3-577a1ee80df5","Type":"ContainerStarted","Data":"098d43a186dc9dea6ac777f9b3e9385ea75eef1c84019fbe5cff8a6e6fb0f8d1"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.740339 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" 
event={"ID":"f9058f87-b187-44c0-b302-712072520e59","Type":"ContainerStarted","Data":"0e024e2a2e64fc30dcf4c9d993dad7f01963e63678798445ae0b33cef42d6fe9"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.749316 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gxcsb" podStartSLOduration=7.7493022190000005 podStartE2EDuration="7.749302219s" podCreationTimestamp="2026-01-31 04:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.74714623 +0000 UTC m=+146.242167895" watchObservedRunningTime="2026-01-31 04:28:57.749302219 +0000 UTC m=+146.244323884" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.749600 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" podStartSLOduration=122.749596347 podStartE2EDuration="2m2.749596347s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.728883022 +0000 UTC m=+146.223904687" watchObservedRunningTime="2026-01-31 04:28:57.749596347 +0000 UTC m=+146.244618002" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.752489 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" event={"ID":"b7936c5a-60b1-4747-bdee-8de1c2952aa7","Type":"ContainerStarted","Data":"e934a3764468f06828480409d414a458b78951cc545dc1da6c8e02526a38eff2"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.752522 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" 
event={"ID":"b7936c5a-60b1-4747-bdee-8de1c2952aa7","Type":"ContainerStarted","Data":"468c191420d7cb558a4769c19c882a63e7faf84e5101b044ea42b645f6ddf47e"} Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.756725 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.756802 4812 patch_prober.go:28] interesting pod/console-operator-58897d9998-b2n85 container/console-operator namespace/openshift-console-operator: Readiness probe status=failure output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.756857 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console-operator/console-operator-58897d9998-b2n85" podUID="df192deb-9b42-48e1-86d7-b85b217d6c1e" containerName="console-operator" probeResult="failure" output="Get \"https://10.217.0.15:8443/readyz\": dial tcp 10.217.0.15:8443: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.760113 4812 patch_prober.go:28] interesting pod/olm-operator-6b444d44fb-lhxxt container/olm-operator namespace/openshift-operator-lifecycle-manager: Readiness probe status=failure output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.760170 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" podUID="9e3da597-67fd-4e7c-8e35-4ef12610beef" containerName="olm-operator" probeResult="failure" output="Get \"https://10.217.0.39:8443/healthz\": dial tcp 10.217.0.39:8443: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.764749 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-config-operator/openshift-config-operator-7777fb866f-qcsf7" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.767698 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.773458 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/catalog-operator-68c6474976-j2ppp" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.782490 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" podStartSLOduration=122.782473092 podStartE2EDuration="2m2.782473092s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.780406566 +0000 UTC m=+146.275428231" watchObservedRunningTime="2026-01-31 04:28:57.782473092 +0000 UTC m=+146.277494757" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.783341 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.784779 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.284764325 +0000 UTC m=+146.779785990 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.856445 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-777779d784-sk6pl" podStartSLOduration=122.856428667 podStartE2EDuration="2m2.856428667s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.855186693 +0000 UTC m=+146.350208358" watchObservedRunningTime="2026-01-31 04:28:57.856428667 +0000 UTC m=+146.351450322" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.858001 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-9c57cc56f-7qlmj" podStartSLOduration=121.85799405 podStartE2EDuration="2m1.85799405s" podCreationTimestamp="2026-01-31 04:26:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.813384954 +0000 UTC m=+146.308406619" watchObservedRunningTime="2026-01-31 04:28:57.85799405 +0000 UTC m=+146.353015715" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.887926 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: 
\"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.892740 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.392727196 +0000 UTC m=+146.887748861 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.923185 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" podStartSLOduration=122.923169826 podStartE2EDuration="2m2.923169826s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.886448186 +0000 UTC m=+146.381469861" watchObservedRunningTime="2026-01-31 04:28:57.923169826 +0000 UTC m=+146.418191491" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.923774 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-operator/ingress-operator-5b745b69d9-qv56p" podStartSLOduration=122.923770132 podStartE2EDuration="2m2.923770132s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 
04:28:57.922634711 +0000 UTC m=+146.417656376" watchObservedRunningTime="2026-01-31 04:28:57.923770132 +0000 UTC m=+146.418791797" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.955457 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-7954f5f757-plj5m" podStartSLOduration=122.955441145 podStartE2EDuration="2m2.955441145s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:57.952806373 +0000 UTC m=+146.447828038" watchObservedRunningTime="2026-01-31 04:28:57.955441145 +0000 UTC m=+146.450462810" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.957394 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.957424 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.958914 4812 patch_prober.go:28] interesting pod/apiserver-76f77b778f-tt2tp container/openshift-apiserver namespace/openshift-apiserver: Startup probe status=failure output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" start-of-body= Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.958978 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" podUID="ca95cb69-0209-4b77-8d05-608c83cdddc2" containerName="openshift-apiserver" probeResult="failure" output="Get \"https://10.217.0.5:8443/livez\": dial tcp 10.217.0.5:8443: connect: connection refused" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.987147 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:28:57 crc kubenswrapper[4812]: I0131 04:28:57.989582 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:57 crc kubenswrapper[4812]: E0131 04:28:57.989960 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.489946655 +0000 UTC m=+146.984968320 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.040694 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-8fkq9" podStartSLOduration=8.040677408 podStartE2EDuration="8.040677408s" podCreationTimestamp="2026-01-31 04:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.040000559 +0000 UTC m=+146.535022224" watchObservedRunningTime="2026-01-31 04:28:58.040677408 +0000 UTC m=+146.535699073" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.090945 4812 reconciler_common.go:218] "operationExecutor.MountVolume started 
for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.091435 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.59141657 +0000 UTC m=+147.086438235 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.092988 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-operator-74547568cd-jnqk4" podStartSLOduration=123.092976933 podStartE2EDuration="2m3.092976933s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.0921433 +0000 UTC m=+146.587164965" watchObservedRunningTime="2026-01-31 04:28:58.092976933 +0000 UTC m=+146.587998598" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.094224 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-admission-controller-857f4d67dd-2mv2d" podStartSLOduration=123.094218577 
podStartE2EDuration="2m3.094218577s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.070286195 +0000 UTC m=+146.565307860" watchObservedRunningTime="2026-01-31 04:28:58.094218577 +0000 UTC m=+146.589240242" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.192278 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.192694 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.692679209 +0000 UTC m=+147.187700874 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.199984 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager-operator/kube-controller-manager-operator-78b949d7b-vc7bf" podStartSLOduration=123.199970638 podStartE2EDuration="2m3.199970638s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.199170546 +0000 UTC m=+146.694192211" watchObservedRunningTime="2026-01-31 04:28:58.199970638 +0000 UTC m=+146.694992303" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.293513 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.293789 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.793778414 +0000 UTC m=+147.288800079 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.308778 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/machine-config-controller-84d6567774-bvhp8" podStartSLOduration=123.308760222 podStartE2EDuration="2m3.308760222s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.233006338 +0000 UTC m=+146.728028003" watchObservedRunningTime="2026-01-31 04:28:58.308760222 +0000 UTC m=+146.803781887" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.334925 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" podStartSLOduration=123.334907885 podStartE2EDuration="2m3.334907885s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.310200171 +0000 UTC m=+146.805221836" watchObservedRunningTime="2026-01-31 04:28:58.334907885 +0000 UTC m=+146.829929550" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.385883 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-scheduler-operator/openshift-kube-scheduler-operator-5fdd9b5758-n6qxf" podStartSLOduration=123.385867653 podStartE2EDuration="2m3.385867653s" podCreationTimestamp="2026-01-31 
04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.335263054 +0000 UTC m=+146.830284719" watchObservedRunningTime="2026-01-31 04:28:58.385867653 +0000 UTC m=+146.880889318" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.393938 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.394127 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.894100428 +0000 UTC m=+147.389122093 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.394230 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.394556 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.89454926 +0000 UTC m=+147.389570925 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.454435 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-59844c95c7-zwdl9" podStartSLOduration=123.454417711 podStartE2EDuration="2m3.454417711s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.385392231 +0000 UTC m=+146.880413896" watchObservedRunningTime="2026-01-31 04:28:58.454417711 +0000 UTC m=+146.949439366" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.495203 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.495488 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:58.99547364 +0000 UTC m=+147.490495305 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.517049 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:58 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:58 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:58 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.517097 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.585584 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-b67b599dd-8mhvg" podStartSLOduration=123.585567855 podStartE2EDuration="2m3.585567855s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.541816343 +0000 UTC m=+147.036838018" watchObservedRunningTime="2026-01-31 04:28:58.585567855 +0000 UTC m=+147.080589510" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.596959 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.597314 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.097296514 +0000 UTC m=+147.592318179 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.659854 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/cluster-image-registry-operator-dc59b4c8b-lfhhx" podStartSLOduration=123.659825278 podStartE2EDuration="2m3.659825278s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:58.618388869 +0000 UTC m=+147.113410534" watchObservedRunningTime="2026-01-31 04:28:58.659825278 +0000 UTC m=+147.154846943" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.697501 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" 
(UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.697872 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.197857814 +0000 UTC m=+147.692879479 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.771577 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" event={"ID":"f3ab29b7-5bdf-4727-999f-8a0a9f104374","Type":"ContainerStarted","Data":"409b9d8b2f28af55414b5aee02ed6e420e831d16dbb50e0dc47c4e19b1bb2c7b"} Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.775620 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" event={"ID":"88240dfe-5946-4e69-99e3-2a429675a53f","Type":"ContainerStarted","Data":"0b119e341186221e11641cd254e8e968d85401362ae6f4cf08f48f463661e7b1"} Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.786724 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: 
connection refused" start-of-body= Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.786780 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.787053 4812 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wh59s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.787113 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.792642 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/olm-operator-6b444d44fb-lhxxt" Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.798650 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.799064 4812 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.299049611 +0000 UTC m=+147.794071276 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.901030 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.901306 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.401281797 +0000 UTC m=+147.896303462 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.901751 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:58 crc kubenswrapper[4812]: E0131 04:28:58.904635 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.404620508 +0000 UTC m=+147.899642173 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:58 crc kubenswrapper[4812]: I0131 04:28:58.950772 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/packageserver-d55dfcdfc-bpzbl" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.003273 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.003544 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.503530623 +0000 UTC m=+147.998552288 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.075072 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" podStartSLOduration=124.075055652 podStartE2EDuration="2m4.075055652s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:59.063647731 +0000 UTC m=+147.558669396" watchObservedRunningTime="2026-01-31 04:28:59.075055652 +0000 UTC m=+147.570077327" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.079184 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-58897d9998-b2n85" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.105021 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.105425 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 04:28:59.605407879 +0000 UTC m=+148.100429544 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.206061 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.206391 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.7063766 +0000 UTC m=+148.201398265 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.309507 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.309771 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.809760137 +0000 UTC m=+148.304781802 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.408564 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.409417 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.410079 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.410239 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:28:59.910215915 +0000 UTC m=+148.405237580 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.416824 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.428761 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.511967 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.512041 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.512083 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2t89x\" (UniqueName: \"kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x\") pod 
\"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.512119 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.512151 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.513116 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-nginx-conf\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.513131 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.013115058 +0000 UTC m=+148.508136723 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.518948 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:28:59 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:28:59 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:28:59 crc kubenswrapper[4812]: healthz check failed Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.518983 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.595357 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.596517 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.599437 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.608476 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.612734 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.612875 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.112852606 +0000 UTC m=+148.607874271 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613037 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: \"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613077 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613104 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2t89x\" (UniqueName: \"kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613137 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-cqllr\" (UniqueName: 
\"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613160 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613181 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613199 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.613348 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.113338199 +0000 UTC m=+148.608359864 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.613860 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.614089 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.618478 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/5fe485a1-e14f-4c09-b5b9-f252bc42b7e8-networking-console-plugin-cert\") pod \"networking-console-plugin-85b44fc459-gdk6g\" (UID: \"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8\") " pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.618670 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s2dwl\" (UniqueName: \"kubernetes.io/projected/9d751cbb-f2e2-430d-9754-c882a5e924a5-kube-api-access-s2dwl\") pod \"network-check-source-55646444c4-trplf\" (UID: 
\"9d751cbb-f2e2-430d-9754-c882a5e924a5\") " pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.621367 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-cqllr\" (UniqueName: \"kubernetes.io/projected/3b6479f0-333b-4a96-9adf-2099afdc2447-kube-api-access-cqllr\") pod \"network-check-target-xd92c\" (UID: \"3b6479f0-333b-4a96-9adf-2099afdc2447\") " pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.668733 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2t89x\" (UniqueName: \"kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x\") pod \"certified-operators-sj99w\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.714482 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.714616 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.214598758 +0000 UTC m=+148.709620423 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.714653 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tx8z\" (UniqueName: \"kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.714685 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.714746 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.714778 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.715038 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.215031521 +0000 UTC m=+148.710053186 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.738981 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.806925 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.806964 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.807394 4812 patch_prober.go:28] interesting pod/marketplace-operator-79b997595-wh59s container/marketplace-operator namespace/openshift-marketplace: Readiness probe status=failure output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" start-of-body= Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.807444 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" probeResult="failure" output="Get \"http://10.217.0.22:8080/healthz\": dial tcp 10.217.0.22:8080: connect: connection refused" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.807577 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" event={"ID":"88240dfe-5946-4e69-99e3-2a429675a53f","Type":"ContainerStarted","Data":"1f3efcb7808404cb20d000dfb4334da35e22f0bbba4993878a9c22bff7345e15"} Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.807605 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" event={"ID":"88240dfe-5946-4e69-99e3-2a429675a53f","Type":"ContainerStarted","Data":"714775bf92f9edfcb10498bb6658e68cf1a4813cb8a67ef4c3f5371ff33575b1"} Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.815733 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.815901 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7tx8z\" (UniqueName: \"kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.815947 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.315914829 +0000 UTC m=+148.810936494 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.815996 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.816104 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.816212 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.816272 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. 
No retries permitted until 2026-01-31 04:29:00.316259619 +0000 UTC m=+148.811281284 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.816594 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.816874 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.820401 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.821343 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.834425 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.849909 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="hostpath-provisioner/csi-hostpathplugin-lwmmp" podStartSLOduration=9.849892015 podStartE2EDuration="9.849892015s" podCreationTimestamp="2026-01-31 04:28:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:28:59.847651723 +0000 UTC m=+148.342673388" watchObservedRunningTime="2026-01-31 04:28:59.849892015 +0000 UTC m=+148.344913690" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.859801 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tx8z\" (UniqueName: \"kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z\") pod \"community-operators-56x2d\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.879624 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.892366 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.910208 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.910385 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.918451 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.918909 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8ffw8\" (UniqueName: \"kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.919087 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:28:59 crc kubenswrapper[4812]: I0131 04:28:59.919139 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:28:59 crc kubenswrapper[4812]: E0131 04:28:59.919855 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" 
failed. No retries permitted until 2026-01-31 04:29:00.41982063 +0000 UTC m=+148.914842295 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.005467 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.006409 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.023493 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.023552 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8ffw8\" (UniqueName: \"kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.023592 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: 
\"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.023611 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.024251 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.024442 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.024795 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.524785351 +0000 UTC m=+149.019807016 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.030833 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.063760 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8ffw8\" (UniqueName: \"kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8\") pod \"certified-operators-94kb7\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.126707 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.126860 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h8dn5\" (UniqueName: \"kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.126900 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.126919 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.127055 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.627040887 +0000 UTC m=+149.122062552 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.135150 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.215752 4812 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.227689 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.227731 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.227810 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-h8dn5\" (UniqueName: \"kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.227852 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc 
kubenswrapper[4812]: E0131 04:29:00.228098 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.72808709 +0000 UTC m=+149.223108755 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.228453 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.228716 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.261342 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h8dn5\" (UniqueName: \"kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5\") pod \"community-operators-2z5tv\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc 
kubenswrapper[4812]: I0131 04:29:00.328757 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.329144 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.829128703 +0000 UTC m=+149.324150368 (durationBeforeRetry 500ms). Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.376512 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.430176 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.430603 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:00.930591288 +0000 UTC m=+149.425612943 (durationBeforeRetry 500ms). Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.435986 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.529544 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:00 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:00 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:00 crc 
kubenswrapper[4812]: healthz check failed Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.529591 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.533418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.533722 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:01.033706627 +0000 UTC m=+149.528728292 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.634992 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.635297 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:01.135282865 +0000 UTC m=+149.630304530 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.736106 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.736369 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:01.236336369 +0000 UTC m=+149.731358034 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.736522 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.736882 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName: nodeName:}" failed. No retries permitted until 2026-01-31 04:29:01.236874843 +0000 UTC m=+149.731896508 (durationBeforeRetry 500ms). 
Error: MountVolume.MountDevice failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "image-registry-697d97f7c8-5c747" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b") : kubernetes.io/csi: attacher.MountDevice failed to create newCsiDriverClient: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.804866 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.836478 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f193c548b987927ccb45590885fe0232d874029877707ddd437b700b8d0d175b"} Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.842399 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:00 crc kubenswrapper[4812]: E0131 04:29:00.842803 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8 podName:8f668bae-612b-4b75-9490-919e737c6a3b nodeName:}" failed. No retries permitted until 2026-01-31 04:29:01.342787649 +0000 UTC m=+149.837809304 (durationBeforeRetry 500ms). 
Error: UnmountVolume.TearDown failed for volume "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: "8f668bae-612b-4b75-9490-919e737c6a3b") : kubernetes.io/csi: Unmounter.TearDownAt failed to get CSI client: driver name kubevirt.io.hostpath-provisioner not found in the list of registered CSI drivers Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.853858 4812 generic.go:334] "Generic (PLEG): container finished" podID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerID="79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062" exitCode=0 Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.854462 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerDied","Data":"79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062"} Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.854490 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerStarted","Data":"9a1f35ffa250e51d2c1f4966b18ada5533fef5d912eb4d60dde54af38de4ada4"} Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.860024 4812 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.910888 4812 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/kubevirt.io.hostpath-provisioner-reg.sock","Timestamp":"2026-01-31T04:29:00.215776544Z","Handler":null,"Name":""} Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.928129 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 
04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.938981 4812 csi_plugin.go:100] kubernetes.io/csi: Trying to validate a new CSI Driver with name: kubevirt.io.hostpath-provisioner endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock versions: 1.0.0 Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.939014 4812 csi_plugin.go:113] kubernetes.io/csi: Register new plugin with name: kubevirt.io.hostpath-provisioner at endpoint: /var/lib/kubelet/plugins/csi-hostpath/csi.sock Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.943476 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.966807 4812 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 31 04:29:00 crc kubenswrapper[4812]: I0131 04:29:00.966858 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/1f4776af88835e41c12b831b4c9fed40233456d14189815a54dbe7f892fc1983/globalmount\"" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.035798 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.051799 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-697d97f7c8-5c747\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.146201 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"8f668bae-612b-4b75-9490-919e737c6a3b\" (UID: \"8f668bae-612b-4b75-9490-919e737c6a3b\") " Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.185888 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8") pod "8f668bae-612b-4b75-9490-919e737c6a3b" (UID: 
"8f668bae-612b-4b75-9490-919e737c6a3b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.330111 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.518982 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:01 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:01 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:01 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.519195 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.565122 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.591963 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.592860 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.595241 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.616967 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.653338 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w8v6l\" (UniqueName: \"kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.653392 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.653464 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.754278 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities\") pod \"redhat-marketplace-bfclc\" (UID: 
\"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.754528 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w8v6l\" (UniqueName: \"kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.754554 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.754682 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.754872 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.767016 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.767748 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.770315 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager"/"installer-sa-dockercfg-kjl2n" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.770732 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager"/"kube-root-ca.crt" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.771667 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w8v6l\" (UniqueName: \"kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l\") pod \"redhat-marketplace-bfclc\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.776714 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.856600 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.856754 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.864934 4812 generic.go:334] "Generic (PLEG): container 
finished" podID="a16a82f9-4289-4749-bc62-df59dacefac1" containerID="9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b" exitCode=0 Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.864996 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerDied","Data":"9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.865024 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerStarted","Data":"3c2f9ecd49074dbef0afbbf95d2fa12287e098f61c07e2b5a3bec7ae05b19908"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.867098 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" event={"ID":"a2bf1dee-41d8-4797-ae33-0e659438727b","Type":"ContainerStarted","Data":"c0a4d23b092e2b0f54b4c81e0604d5afd4d1dc7de77db4f65d5f50ca3345499e"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.867122 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" event={"ID":"a2bf1dee-41d8-4797-ae33-0e659438727b","Type":"ContainerStarted","Data":"af78a33f86b096dffb569eb2bd6d458c4004524ec53d2803636625378c8cf385"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.867504 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.869227 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"c712ad4416df43ab1cbc35253a18df07a7105e2701cee2fab8cc1ca4f2e8b641"} Jan 31 04:29:01 
crc kubenswrapper[4812]: I0131 04:29:01.869249 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-xd92c" event={"ID":"3b6479f0-333b-4a96-9adf-2099afdc2447","Type":"ContainerStarted","Data":"858c7c019e00b50a94143510071bfdf84a182d75d58d5bb00fce59c8ba3fb80d"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.869755 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.871446 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"895ba9b9ff9697117cf4ee6145a90f4a99aa504daf6303946247bf9c0faa1c30"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.871494 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-85b44fc459-gdk6g" event={"ID":"5fe485a1-e14f-4c09-b5b9-f252bc42b7e8","Type":"ContainerStarted","Data":"b1d2c05e8fb59152a6279c9f6449bae1b305ff006b3907938c01ab103de67bad"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.874462 4812 generic.go:334] "Generic (PLEG): container finished" podID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerID="d963cad55efc340ffe3867f97e99aa62dace22acfbcc8e929a6aaf2f8f0a019b" exitCode=0 Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.874527 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerDied","Data":"d963cad55efc340ffe3867f97e99aa62dace22acfbcc8e929a6aaf2f8f0a019b"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.874577 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" 
event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerStarted","Data":"33dc858c5df0f26a2c7f43867f43daf311a5ae77a3e3ca06e017d8b19e201714"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.876052 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-55646444c4-trplf" event={"ID":"9d751cbb-f2e2-430d-9754-c882a5e924a5","Type":"ContainerStarted","Data":"f48210133e27a82032a28fdb51fc23d6916980c17483826ca1da4ac2979a755b"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.879225 4812 generic.go:334] "Generic (PLEG): container finished" podID="e97d5bed-3c11-40af-869e-53c260650edb" containerID="f2b32469863c3a03f5bb64d42c151a2c1fe58560d028c1e25b3fb4bf79055b6f" exitCode=0 Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.879266 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerDied","Data":"f2b32469863c3a03f5bb64d42c151a2c1fe58560d028c1e25b3fb4bf79055b6f"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.879295 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerStarted","Data":"a8a6d702db9cf37fd534778f4cb6a300c57b5102151186b3e73f9cba2b4731c8"} Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.907294 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.958370 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.958491 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.973111 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.989210 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:29:01 crc kubenswrapper[4812]: I0131 04:29:01.990131 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.008790 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.016165 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" podStartSLOduration=127.016148801 podStartE2EDuration="2m7.016148801s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:02.00913953 +0000 UTC m=+150.504161185" watchObservedRunningTime="2026-01-31 04:29:02.016148801 +0000 UTC m=+150.511170466" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.018715 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.063902 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.064049 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " 
pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.064102 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g5b6g\" (UniqueName: \"kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.074873 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.075421 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.078312 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.078881 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.089295 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.089509 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.165754 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.165799 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.165824 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.165861 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.165912 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g5b6g\" (UniqueName: \"kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g\") pod \"redhat-marketplace-sktpt\" (UID: 
\"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.167084 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.167220 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.182671 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g5b6g\" (UniqueName: \"kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g\") pod \"redhat-marketplace-sktpt\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.227508 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:29:02 crc kubenswrapper[4812]: W0131 04:29:02.234101 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6e79cce_8b4e_491b_a976_a3649e3566cd.slice/crio-1b4f331baad4814be56b4c73a7c5324ad59c5d53c894ac487a2263fedc640413 WatchSource:0}: Error finding container 1b4f331baad4814be56b4c73a7c5324ad59c5d53c894ac487a2263fedc640413: Status 404 returned error can't find the container with id 1b4f331baad4814be56b4c73a7c5324ad59c5d53c894ac487a2263fedc640413 Jan 31 04:29:02 crc 
kubenswrapper[4812]: I0131 04:29:02.266998 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.267050 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.267113 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.284725 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access\") pod \"revision-pruner-8-crc\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.351313 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.357264 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8f668bae-612b-4b75-9490-919e737c6a3b" path="/var/lib/kubelet/pods/8f668bae-612b-4b75-9490-919e737c6a3b/volumes" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.357832 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-controller-manager/revision-pruner-9-crc"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.398185 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.520029 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:02 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:02 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:02 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.520320 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.593673 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.606509 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.611313 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.613249 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.613655 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.673663 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.673745 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qs4cj\" (UniqueName: \"kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.673781 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.775253 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content\") pod \"redhat-operators-487ln\" (UID: 
\"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.775327 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qs4cj\" (UniqueName: \"kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.775360 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.775786 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.775903 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.796136 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qs4cj\" (UniqueName: \"kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj\") pod \"redhat-operators-487ln\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " 
pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.880889 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.880961 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.881952 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-8-crc"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.888563 4812 patch_prober.go:28] interesting pod/console-f9d7485db-4f2hw container/console namespace/openshift-console: Startup probe status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.888604 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4f2hw" podUID="ccd8432b-f254-450e-9b70-e0e89ead504d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.896105 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d5723db-1696-4fe1-a736-756e9bf39115" containerID="5253eb5a05fb0be7a9c86b4254bcd7ed80c9c6440c7c6aed5f218e666951e399" exitCode=0 Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.896166 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" event={"ID":"1d5723db-1696-4fe1-a736-756e9bf39115","Type":"ContainerDied","Data":"5253eb5a05fb0be7a9c86b4254bcd7ed80c9c6440c7c6aed5f218e666951e399"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.905899 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df595fd9-ed01-4a1c-a134-5e79f2bc704c","Type":"ContainerStarted","Data":"82698149ad937d36e494f5f698147abd790cec293a76cb9bbd4da16708bf69bb"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.905945 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df595fd9-ed01-4a1c-a134-5e79f2bc704c","Type":"ContainerStarted","Data":"f9496e4345478fb80e8b5b03a65c4184b555a1ac7b1e8c47607a44404e3cc75e"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.910438 4812 generic.go:334] "Generic (PLEG): container finished" podID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerID="8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62" exitCode=0 Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.910514 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerDied","Data":"8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.910539 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerStarted","Data":"1b4f331baad4814be56b4c73a7c5324ad59c5d53c894ac487a2263fedc640413"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.928306 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-controller-manager/revision-pruner-9-crc" podStartSLOduration=1.928290805 podStartE2EDuration="1.928290805s" podCreationTimestamp="2026-01-31 04:29:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:02.927799981 +0000 UTC m=+151.422821646" watchObservedRunningTime="2026-01-31 
04:29:02.928290805 +0000 UTC m=+151.423312470" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.935223 4812 generic.go:334] "Generic (PLEG): container finished" podID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerID="f97096db8185c82904d384f77f0b5fcee905aa11744a2342c654e99169a57ff2" exitCode=0 Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.935309 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerDied","Data":"f97096db8185c82904d384f77f0b5fcee905aa11744a2342c654e99169a57ff2"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.935585 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerStarted","Data":"022a89e8489d0306fb9a757b3466fdabae91fc2b08bfba554cf816d64e3a7d87"} Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.971847 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.980046 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-apiserver/apiserver-76f77b778f-tt2tp" Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.992183 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:29:02 crc kubenswrapper[4812]: I0131 04:29:02.993101 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.006747 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.018526 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.047311 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.086903 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctmtv\" (UniqueName: \"kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.087058 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.087124 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.187989 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-ctmtv\" (UniqueName: \"kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.188092 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.188135 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.188615 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.189188 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.251361 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctmtv\" 
(UniqueName: \"kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv\") pod \"redhat-operators-jtdpt\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.325999 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.386919 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:29:03 crc kubenswrapper[4812]: W0131 04:29:03.442743 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4dd847bf_95c7_48c4_9042_f078db7c8438.slice/crio-313226391aedff4e14cbad291285afd18f3e89973bf3b025bab1fbfd16584a3c WatchSource:0}: Error finding container 313226391aedff4e14cbad291285afd18f3e89973bf3b025bab1fbfd16584a3c: Status 404 returned error can't find the container with id 313226391aedff4e14cbad291285afd18f3e89973bf3b025bab1fbfd16584a3c Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.517425 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.538144 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:03 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:03 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:03 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.538184 4812 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.538686 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.538704 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.577424 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.682676 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.716566 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.802557 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.802601 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.802923 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m 
container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.802978 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.943595 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43492cc6-4f5b-4897-8693-9ddc34420d98","Type":"ContainerStarted","Data":"8b98cd3d892f56a497152b9437c0214326492f97bc762d5e4a92e7cb2c63e0c4"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.943915 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43492cc6-4f5b-4897-8693-9ddc34420d98","Type":"ContainerStarted","Data":"b45f59e7a6b368ca4e1bc9eb2e2ef62d8df56b2e249be41bda272782a1c401e1"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.946500 4812 generic.go:334] "Generic (PLEG): container finished" podID="df595fd9-ed01-4a1c-a134-5e79f2bc704c" containerID="82698149ad937d36e494f5f698147abd790cec293a76cb9bbd4da16708bf69bb" exitCode=0 Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.946593 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df595fd9-ed01-4a1c-a134-5e79f2bc704c","Type":"ContainerDied","Data":"82698149ad937d36e494f5f698147abd790cec293a76cb9bbd4da16708bf69bb"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.947615 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" 
event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerStarted","Data":"528cd7e8b53114ff61920038f455c2ce5829e71278b1344bea483e3f4cb9e065"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.949143 4812 generic.go:334] "Generic (PLEG): container finished" podID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerID="edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755" exitCode=0 Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.950244 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerDied","Data":"edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.950281 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerStarted","Data":"313226391aedff4e14cbad291285afd18f3e89973bf3b025bab1fbfd16584a3c"} Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.958295 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-oauth-apiserver/apiserver-7bbb656c7d-5gx6v" Jan 31 04:29:03 crc kubenswrapper[4812]: I0131 04:29:03.960494 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/revision-pruner-8-crc" podStartSLOduration=1.96048355 podStartE2EDuration="1.96048355s" podCreationTimestamp="2026-01-31 04:29:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:03.955616407 +0000 UTC m=+152.450638072" watchObservedRunningTime="2026-01-31 04:29:03.96048355 +0000 UTC m=+152.455505205" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.364887 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.413950 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv7fr\" (UniqueName: \"kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr\") pod \"1d5723db-1696-4fe1-a736-756e9bf39115\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.414022 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume\") pod \"1d5723db-1696-4fe1-a736-756e9bf39115\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.414055 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume\") pod \"1d5723db-1696-4fe1-a736-756e9bf39115\" (UID: \"1d5723db-1696-4fe1-a736-756e9bf39115\") " Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.415052 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume" (OuterVolumeSpecName: "config-volume") pod "1d5723db-1696-4fe1-a736-756e9bf39115" (UID: "1d5723db-1696-4fe1-a736-756e9bf39115"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.415752 4812 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/1d5723db-1696-4fe1-a736-756e9bf39115-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.518181 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:04 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:04 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:04 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.518230 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.595850 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "1d5723db-1696-4fe1-a736-756e9bf39115" (UID: "1d5723db-1696-4fe1-a736-756e9bf39115"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.595913 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr" (OuterVolumeSpecName: "kube-api-access-bv7fr") pod "1d5723db-1696-4fe1-a736-756e9bf39115" (UID: "1d5723db-1696-4fe1-a736-756e9bf39115"). InnerVolumeSpecName "kube-api-access-bv7fr". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.618000 4812 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/1d5723db-1696-4fe1-a736-756e9bf39115-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.618027 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv7fr\" (UniqueName: \"kubernetes.io/projected/1d5723db-1696-4fe1-a736-756e9bf39115-kube-api-access-bv7fr\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.991787 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" event={"ID":"1d5723db-1696-4fe1-a736-756e9bf39115","Type":"ContainerDied","Data":"20b59b73d7eb79077ca13ea5a7dab1655e384b9233daa85ff400b97c65ccd8c3"} Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.991826 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="20b59b73d7eb79077ca13ea5a7dab1655e384b9233daa85ff400b97c65ccd8c3" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.991903 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb" Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.996894 4812 generic.go:334] "Generic (PLEG): container finished" podID="43492cc6-4f5b-4897-8693-9ddc34420d98" containerID="8b98cd3d892f56a497152b9437c0214326492f97bc762d5e4a92e7cb2c63e0c4" exitCode=0 Jan 31 04:29:04 crc kubenswrapper[4812]: I0131 04:29:04.996975 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43492cc6-4f5b-4897-8693-9ddc34420d98","Type":"ContainerDied","Data":"8b98cd3d892f56a497152b9437c0214326492f97bc762d5e4a92e7cb2c63e0c4"} Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.000209 4812 generic.go:334] "Generic (PLEG): container finished" podID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerID="f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270" exitCode=0 Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.000959 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerDied","Data":"f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270"} Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.254432 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.331390 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir\") pod \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.331525 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access\") pod \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\" (UID: \"df595fd9-ed01-4a1c-a134-5e79f2bc704c\") " Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.331513 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "df595fd9-ed01-4a1c-a134-5e79f2bc704c" (UID: "df595fd9-ed01-4a1c-a134-5e79f2bc704c"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.331750 4812 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.344443 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "df595fd9-ed01-4a1c-a134-5e79f2bc704c" (UID: "df595fd9-ed01-4a1c-a134-5e79f2bc704c"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.433285 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/df595fd9-ed01-4a1c-a134-5e79f2bc704c-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.517206 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:05 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:05 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:05 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:05 crc kubenswrapper[4812]: I0131 04:29:05.517277 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.040658 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-controller-manager/revision-pruner-9-crc" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.040645 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/revision-pruner-9-crc" event={"ID":"df595fd9-ed01-4a1c-a134-5e79f2bc704c","Type":"ContainerDied","Data":"f9496e4345478fb80e8b5b03a65c4184b555a1ac7b1e8c47607a44404e3cc75e"} Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.040702 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f9496e4345478fb80e8b5b03a65c4184b555a1ac7b1e8c47607a44404e3cc75e" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.383824 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.450454 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access\") pod \"43492cc6-4f5b-4897-8693-9ddc34420d98\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.450562 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir\") pod \"43492cc6-4f5b-4897-8693-9ddc34420d98\" (UID: \"43492cc6-4f5b-4897-8693-9ddc34420d98\") " Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.450820 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "43492cc6-4f5b-4897-8693-9ddc34420d98" (UID: "43492cc6-4f5b-4897-8693-9ddc34420d98"). InnerVolumeSpecName "kubelet-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.477488 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "43492cc6-4f5b-4897-8693-9ddc34420d98" (UID: "43492cc6-4f5b-4897-8693-9ddc34420d98"). InnerVolumeSpecName "kube-api-access". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.518581 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:06 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:06 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:06 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.518630 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.551643 4812 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/43492cc6-4f5b-4897-8693-9ddc34420d98-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:06 crc kubenswrapper[4812]: I0131 04:29:06.551670 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/43492cc6-4f5b-4897-8693-9ddc34420d98-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:07 crc kubenswrapper[4812]: I0131 04:29:07.081993 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-apiserver/revision-pruner-8-crc" event={"ID":"43492cc6-4f5b-4897-8693-9ddc34420d98","Type":"ContainerDied","Data":"b45f59e7a6b368ca4e1bc9eb2e2ef62d8df56b2e249be41bda272782a1c401e1"} Jan 31 04:29:07 crc kubenswrapper[4812]: I0131 04:29:07.082030 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b45f59e7a6b368ca4e1bc9eb2e2ef62d8df56b2e249be41bda272782a1c401e1" Jan 31 04:29:07 crc kubenswrapper[4812]: I0131 04:29:07.082085 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-8-crc" Jan 31 04:29:07 crc kubenswrapper[4812]: I0131 04:29:07.516815 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:07 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:07 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:07 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:07 crc kubenswrapper[4812]: I0131 04:29:07.516911 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:08 crc kubenswrapper[4812]: I0131 04:29:08.517101 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:08 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:08 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:08 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:08 crc 
kubenswrapper[4812]: I0131 04:29:08.517398 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:08 crc kubenswrapper[4812]: I0131 04:29:08.761111 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-8fkq9" Jan 31 04:29:09 crc kubenswrapper[4812]: I0131 04:29:09.515772 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:09 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:09 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:09 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:09 crc kubenswrapper[4812]: I0131 04:29:09.515861 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:10 crc kubenswrapper[4812]: I0131 04:29:10.516115 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:10 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:10 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:10 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:10 crc kubenswrapper[4812]: I0131 04:29:10.516157 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" 
podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:11 crc kubenswrapper[4812]: I0131 04:29:11.516552 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:11 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:11 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:11 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:11 crc kubenswrapper[4812]: I0131 04:29:11.516800 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:12 crc kubenswrapper[4812]: I0131 04:29:12.517775 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:12 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:12 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:12 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:12 crc kubenswrapper[4812]: I0131 04:29:12.517920 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:12 crc kubenswrapper[4812]: I0131 04:29:12.877115 4812 patch_prober.go:28] interesting pod/console-f9d7485db-4f2hw container/console namespace/openshift-console: Startup probe 
status=failure output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" start-of-body= Jan 31 04:29:12 crc kubenswrapper[4812]: I0131 04:29:12.877381 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-console/console-f9d7485db-4f2hw" podUID="ccd8432b-f254-450e-9b70-e0e89ead504d" containerName="console" probeResult="failure" output="Get \"https://10.217.0.19:8443/health\": dial tcp 10.217.0.19:8443: connect: connection refused" Jan 31 04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.517154 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:13 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:13 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:13 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.517221 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.797600 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Liveness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.797640 4812 patch_prober.go:28] interesting pod/downloads-7954f5f757-plj5m container/download-server namespace/openshift-console: Readiness probe status=failure output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" start-of-body= Jan 31 
04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.797694 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:29:13 crc kubenswrapper[4812]: I0131 04:29:13.797746 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-console/downloads-7954f5f757-plj5m" podUID="b6b2d935-b2ef-4444-9c39-2c91695b9765" containerName="download-server" probeResult="failure" output="Get \"http://10.217.0.14:8080/\": dial tcp 10.217.0.14:8080: connect: connection refused" Jan 31 04:29:14 crc kubenswrapper[4812]: I0131 04:29:14.338786 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:29:14 crc kubenswrapper[4812]: I0131 04:29:14.338922 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:29:14 crc kubenswrapper[4812]: I0131 04:29:14.517034 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:14 crc kubenswrapper[4812]: [-]has-synced failed: reason withheld Jan 31 04:29:14 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:14 crc kubenswrapper[4812]: healthz check failed Jan 
31 04:29:14 crc kubenswrapper[4812]: I0131 04:29:14.517119 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:15 crc kubenswrapper[4812]: I0131 04:29:15.521334 4812 patch_prober.go:28] interesting pod/router-default-5444994796-2hcvc container/router namespace/openshift-ingress: Startup probe status=failure output="HTTP probe failed with statuscode: 500" start-of-body=[-]backend-http failed: reason withheld Jan 31 04:29:15 crc kubenswrapper[4812]: [+]has-synced ok Jan 31 04:29:15 crc kubenswrapper[4812]: [+]process-running ok Jan 31 04:29:15 crc kubenswrapper[4812]: healthz check failed Jan 31 04:29:15 crc kubenswrapper[4812]: I0131 04:29:15.521417 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-ingress/router-default-5444994796-2hcvc" podUID="954336e2-74fe-443b-9bef-1247a8935c13" containerName="router" probeResult="failure" output="HTTP probe failed with statuscode: 500" Jan 31 04:29:16 crc kubenswrapper[4812]: I0131 04:29:16.517750 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:29:16 crc kubenswrapper[4812]: I0131 04:29:16.520647 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-5444994796-2hcvc" Jan 31 04:29:17 crc kubenswrapper[4812]: I0131 04:29:17.416828 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:29:17 crc kubenswrapper[4812]: I0131 04:29:17.417157 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" 
containerID="cri-o://098748da72a159c8b96bdbc0c778d5a309cbf23b968142e7da6edac15584c5ea" gracePeriod=30 Jan 31 04:29:17 crc kubenswrapper[4812]: I0131 04:29:17.446399 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:29:17 crc kubenswrapper[4812]: I0131 04:29:17.446805 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" containerID="cri-o://4a4206acc74d2aa313b8a73c44866828cd2c524693f349440e8a298dac61b225" gracePeriod=30 Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.152309 4812 generic.go:334] "Generic (PLEG): container finished" podID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerID="4a4206acc74d2aa313b8a73c44866828cd2c524693f349440e8a298dac61b225" exitCode=0 Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.152453 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" event={"ID":"71bda3a8-4993-48f0-abaf-300a04380ac7","Type":"ContainerDied","Data":"4a4206acc74d2aa313b8a73c44866828cd2c524693f349440e8a298dac61b225"} Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.154192 4812 generic.go:334] "Generic (PLEG): container finished" podID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerID="098748da72a159c8b96bdbc0c778d5a309cbf23b968142e7da6edac15584c5ea" exitCode=0 Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.154239 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" event={"ID":"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc","Type":"ContainerDied","Data":"098748da72a159c8b96bdbc0c778d5a309cbf23b968142e7da6edac15584c5ea"} Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.753738 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:29:18 crc kubenswrapper[4812]: I0131 04:29:18.776921 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/2c369253-313a-484c-bc8a-dae99abab086-metrics-certs\") pod \"network-metrics-daemon-wg68w\" (UID: \"2c369253-313a-484c-bc8a-dae99abab086\") " pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:29:19 crc kubenswrapper[4812]: I0131 04:29:19.063116 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-wg68w" Jan 31 04:29:21 crc kubenswrapper[4812]: I0131 04:29:21.337785 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:29:22 crc kubenswrapper[4812]: I0131 04:29:22.894486 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:29:22 crc kubenswrapper[4812]: I0131 04:29:22.901957 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-f9d7485db-4f2hw" Jan 31 04:29:22 crc kubenswrapper[4812]: I0131 04:29:22.982963 4812 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6246k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" start-of-body= Jan 31 04:29:22 crc kubenswrapper[4812]: I0131 04:29:22.983042 4812 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": dial tcp 10.217.0.6:8443: connect: connection refused" Jan 31 04:29:23 crc kubenswrapper[4812]: I0131 04:29:23.015963 4812 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wb7l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" start-of-body= Jan 31 04:29:23 crc kubenswrapper[4812]: I0131 04:29:23.016050 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: connect: connection refused" Jan 31 04:29:23 crc kubenswrapper[4812]: I0131 04:29:23.806767 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-7954f5f757-plj5m" Jan 31 04:29:33 crc kubenswrapper[4812]: I0131 04:29:33.036238 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operator-lifecycle-manager/package-server-manager-789f6589d5-d4bjq" Jan 31 04:29:33 crc kubenswrapper[4812]: I0131 04:29:33.998858 4812 patch_prober.go:28] interesting pod/controller-manager-879f6c89f-6246k container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" start-of-body= Jan 31 04:29:33 crc kubenswrapper[4812]: I0131 04:29:33.998975 4812 prober.go:107] "Probe failed" 
probeType="Readiness" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.6:8443/healthz\": net/http: request canceled while waiting for connection (Client.Timeout exceeded while awaiting headers)" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.015137 4812 patch_prober.go:28] interesting pod/route-controller-manager-6576b87f9c-wb7l5 container/route-controller-manager namespace/openshift-route-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: i/o timeout" start-of-body= Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.015194 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" probeResult="failure" output="Get \"https://10.217.0.16:8443/healthz\": dial tcp 10.217.0.16:8443: i/o timeout" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.858836 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.867974 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889300 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:34 crc kubenswrapper[4812]: E0131 04:29:34.889600 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="43492cc6-4f5b-4897-8693-9ddc34420d98" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889629 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="43492cc6-4f5b-4897-8693-9ddc34420d98" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: E0131 04:29:34.889650 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889663 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: E0131 04:29:34.889678 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889717 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: E0131 04:29:34.889745 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d5723db-1696-4fe1-a736-756e9bf39115" containerName="collect-profiles" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889758 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d5723db-1696-4fe1-a736-756e9bf39115" containerName="collect-profiles" Jan 31 04:29:34 crc kubenswrapper[4812]: E0131 04:29:34.889783 4812 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="df595fd9-ed01-4a1c-a134-5e79f2bc704c" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.889795 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="df595fd9-ed01-4a1c-a134-5e79f2bc704c" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.890181 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" containerName="controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.890259 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d5723db-1696-4fe1-a736-756e9bf39115" containerName="collect-profiles" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.890277 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="43492cc6-4f5b-4897-8693-9ddc34420d98" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.890294 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="df595fd9-ed01-4a1c-a134-5e79f2bc704c" containerName="pruner" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.890312 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" containerName="route-controller-manager" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.891226 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.919234 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970482 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqxxp\" (UniqueName: \"kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp\") pod \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970528 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert\") pod \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970592 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert\") pod \"71bda3a8-4993-48f0-abaf-300a04380ac7\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970647 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tvt6l\" (UniqueName: \"kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l\") pod \"71bda3a8-4993-48f0-abaf-300a04380ac7\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970696 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca\") pod 
\"71bda3a8-4993-48f0-abaf-300a04380ac7\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970733 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles\") pod \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970768 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config\") pod \"71bda3a8-4993-48f0-abaf-300a04380ac7\" (UID: \"71bda3a8-4993-48f0-abaf-300a04380ac7\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970820 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca\") pod \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.970924 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config\") pod \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\" (UID: \"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc\") " Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971145 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971192 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971228 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbhm7\" (UniqueName: \"kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971271 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971300 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971410 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca" (OuterVolumeSpecName: "client-ca") pod "71bda3a8-4993-48f0-abaf-300a04380ac7" (UID: 
"71bda3a8-4993-48f0-abaf-300a04380ac7"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971455 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config" (OuterVolumeSpecName: "config") pod "71bda3a8-4993-48f0-abaf-300a04380ac7" (UID: "71bda3a8-4993-48f0-abaf-300a04380ac7"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.971800 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca" (OuterVolumeSpecName: "client-ca") pod "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" (UID: "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.972179 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" (UID: "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.972187 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config" (OuterVolumeSpecName: "config") pod "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" (UID: "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.977506 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp" (OuterVolumeSpecName: "kube-api-access-kqxxp") pod "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" (UID: "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc"). InnerVolumeSpecName "kube-api-access-kqxxp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.977875 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "71bda3a8-4993-48f0-abaf-300a04380ac7" (UID: "71bda3a8-4993-48f0-abaf-300a04380ac7"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.983214 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" (UID: "e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:29:34 crc kubenswrapper[4812]: I0131 04:29:34.983425 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l" (OuterVolumeSpecName: "kube-api-access-tvt6l") pod "71bda3a8-4993-48f0-abaf-300a04380ac7" (UID: "71bda3a8-4993-48f0-abaf-300a04380ac7"). InnerVolumeSpecName "kube-api-access-tvt6l". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.072909 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.072987 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073094 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073147 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073205 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbhm7\" (UniqueName: \"kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7\") pod 
\"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073292 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073316 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073338 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kqxxp\" (UniqueName: \"kubernetes.io/projected/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-kube-api-access-kqxxp\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073360 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073379 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/71bda3a8-4993-48f0-abaf-300a04380ac7-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073398 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tvt6l\" (UniqueName: \"kubernetes.io/projected/71bda3a8-4993-48f0-abaf-300a04380ac7-kube-api-access-tvt6l\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073416 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-client-ca\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073433 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.073452 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/71bda3a8-4993-48f0-abaf-300a04380ac7-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.074796 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.075380 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.075755 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.077270 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert\") 
pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.090979 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbhm7\" (UniqueName: \"kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7\") pod \"controller-manager-f7d9f84f5-ls8vr\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.224518 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.268592 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.269707 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5" event={"ID":"71bda3a8-4993-48f0-abaf-300a04380ac7","Type":"ContainerDied","Data":"684baeead2aad15b37ed7c9d09a0a151caf98eb8d5f72c39369cd17c08f8ce9a"} Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.269785 4812 scope.go:117] "RemoveContainer" containerID="4a4206acc74d2aa313b8a73c44866828cd2c524693f349440e8a298dac61b225" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.276385 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" event={"ID":"e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc","Type":"ContainerDied","Data":"dc3fdfaea7df7c57524a86a94e6b6db75fca34a88b78488a9db9592d2e6f8e7d"} Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.276474 4812 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-879f6c89f-6246k" Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.324080 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.330748 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-879f6c89f-6246k"] Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.338317 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:29:35 crc kubenswrapper[4812]: I0131 04:29:35.344036 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-6576b87f9c-wb7l5"] Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.351269 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="71bda3a8-4993-48f0-abaf-300a04380ac7" path="/var/lib/kubelet/pods/71bda3a8-4993-48f0-abaf-300a04380ac7/volumes" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.352410 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc" path="/var/lib/kubelet/pods/e39c1ba0-4fde-4b6d-b9e3-72e77f2061bc/volumes" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.928322 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.929072 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.931860 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.932038 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.932298 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.932648 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.932812 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.932982 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 04:29:36 crc kubenswrapper[4812]: I0131 04:29:36.940881 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.107365 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.107739 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.107761 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.107794 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n98nb\" (UniqueName: \"kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.212291 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-n98nb\" (UniqueName: \"kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.212465 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca\") pod 
\"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.212512 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.212544 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.214516 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.214886 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.219220 4812 operation_generator.go:637] "MountVolume.SetUp succeeded 
for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.231112 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-n98nb\" (UniqueName: \"kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb\") pod \"route-controller-manager-7fb457f777-k4d85\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.254026 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.356453 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:37 crc kubenswrapper[4812]: I0131 04:29:37.463404 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:38 crc kubenswrapper[4812]: E0131 04:29:38.928678 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 04:29:38 crc kubenswrapper[4812]: E0131 04:29:38.929084 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog 
--cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-8ffw8,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-94kb7_openshift-marketplace(7cfc040d-5e3a-4ee4-a72d-c67c8d51d441): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:38 crc kubenswrapper[4812]: E0131 04:29:38.930254 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-94kb7" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" 
Jan 31 04:29:40 crc kubenswrapper[4812]: I0131 04:29:40.376039 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-xd92c" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.072296 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.073687 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.078092 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver"/"kube-root-ca.crt" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.078691 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver"/"installer-sa-dockercfg-5pr6n" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.102016 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.171386 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.171474 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.272513 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.272592 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.272664 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.292718 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access\") pod \"revision-pruner-9-crc\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:41 crc kubenswrapper[4812]: I0131 04:29:41.416978 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:43 crc kubenswrapper[4812]: E0131 04:29:43.442817 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 04:29:43 crc kubenswrapper[4812]: E0131 04:29:43.443036 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-h8dn5,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]Co
ntainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-2z5tv_openshift-marketplace(e97d5bed-3c11-40af-869e-53c260650edb): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:43 crc kubenswrapper[4812]: E0131 04:29:43.444227 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-2z5tv" podUID="e97d5bed-3c11-40af-869e-53c260650edb" Jan 31 04:29:44 crc kubenswrapper[4812]: I0131 04:29:44.338261 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:29:44 crc kubenswrapper[4812]: I0131 04:29:44.338617 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.665165 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.666282 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.677638 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.826710 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.826952 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.827047 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.927935 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.928034 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access\" (UniqueName: 
\"kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.928048 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.928081 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.928124 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.948681 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access\") pod \"installer-9-crc\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:45 crc kubenswrapper[4812]: I0131 04:29:45.992978 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:29:47 crc kubenswrapper[4812]: E0131 04:29:47.527556 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-94kb7" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" Jan 31 04:29:47 crc kubenswrapper[4812]: E0131 04:29:47.528467 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-2z5tv" podUID="e97d5bed-3c11-40af-869e-53c260650edb" Jan 31 04:29:47 crc kubenswrapper[4812]: E0131 04:29:47.703731 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 04:29:47 crc kubenswrapper[4812]: E0131 04:29:47.703994 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-qs4cj,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-operators-487ln_openshift-marketplace(4dd847bf-95c7-48c4-9042-f078db7c8438): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:47 crc kubenswrapper[4812]: E0131 04:29:47.705264 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-487ln" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" Jan 31 04:29:50 crc 
kubenswrapper[4812]: E0131 04:29:50.517563 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-487ln" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" Jan 31 04:29:50 crc kubenswrapper[4812]: I0131 04:29:50.574115 4812 scope.go:117] "RemoveContainer" containerID="098748da72a159c8b96bdbc0c778d5a309cbf23b968142e7da6edac15584c5ea" Jan 31 04:29:50 crc kubenswrapper[4812]: E0131 04:29:50.740541 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 04:29:50 crc kubenswrapper[4812]: E0131 04:29:50.740944 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-w8v6l,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-bfclc_openshift-marketplace(d6e79cce-8b4e-491b-a976-a3649e3566cd): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:50 crc kubenswrapper[4812]: E0131 04:29:50.742112 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-bfclc" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" Jan 31 04:29:50 crc 
kubenswrapper[4812]: E0131 04:29:50.834879 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-operator-index:v4.18" Jan 31 04:29:50 crc kubenswrapper[4812]: E0131 04:29:50.835005 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-ctmtv,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod 
redhat-operators-jtdpt_openshift-marketplace(b369d585-140d-46fb-8b27-42f6fdc8817a): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:50 crc kubenswrapper[4812]: E0131 04:29:50.836198 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-operators-jtdpt" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.025758 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/installer-9-crc"] Jan 31 04:29:51 crc kubenswrapper[4812]: W0131 04:29:51.039336 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod550755c5_af9b_446d_a3ad_d5afc6264e89.slice/crio-b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81 WatchSource:0}: Error finding container b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81: Status 404 returned error can't find the container with id b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81 Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.072453 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-kube-apiserver/revision-pruner-9-crc"] Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.087499 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:51 crc kubenswrapper[4812]: W0131 04:29:51.116788 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc61f916_6212_4fb9_9a7b_25dd3dd61dfa.slice/crio-0cdb504b13689a968367da69f369f317fcd80b9c5a88da08d893464886860b94 WatchSource:0}: Error 
finding container 0cdb504b13689a968367da69f369f317fcd80b9c5a88da08d893464886860b94: Status 404 returned error can't find the container with id 0cdb504b13689a968367da69f369f317fcd80b9c5a88da08d893464886860b94 Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.132151 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.134411 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/community-operator-index:v4.18" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.134568 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/community-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-7tx8z,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod community-operators-56x2d_openshift-marketplace(a16a82f9-4289-4749-bc62-df59dacefac1): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.136367 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/community-operators-56x2d" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" Jan 31 04:29:51 crc 
kubenswrapper[4812]: I0131 04:29:51.142723 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-wg68w"] Jan 31 04:29:51 crc kubenswrapper[4812]: W0131 04:29:51.147199 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod22b629d2_4035_41be_9ef1_bb74508929eb.slice/crio-562bd1688e18db44ada69b98eedb855630f528ff7f468adc2a20f7c69e9c1101 WatchSource:0}: Error finding container 562bd1688e18db44ada69b98eedb855630f528ff7f468adc2a20f7c69e9c1101: Status 404 returned error can't find the container with id 562bd1688e18db44ada69b98eedb855630f528ff7f468adc2a20f7c69e9c1101 Jan 31 04:29:51 crc kubenswrapper[4812]: W0131 04:29:51.149743 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2c369253_313a_484c_bc8a_dae99abab086.slice/crio-ce58c2d4371a4d62cd94445845e7476a036f1a8fcda668e0fd8242488a6820fe WatchSource:0}: Error finding container ce58c2d4371a4d62cd94445845e7476a036f1a8fcda668e0fd8242488a6820fe: Status 404 returned error can't find the container with id ce58c2d4371a4d62cd94445845e7476a036f1a8fcda668e0fd8242488a6820fe Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.361269 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/redhat-marketplace-index:v4.18" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.361579 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/redhat-marketplace-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache 
--cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-g5b6g,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,StdinOnce:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod redhat-marketplace-sktpt_openshift-marketplace(e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.364697 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/redhat-marketplace-sktpt" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" Jan 31 04:29:51 crc 
kubenswrapper[4812]: I0131 04:29:51.381237 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7","Type":"ContainerStarted","Data":"45bfd5a2cf45e0cff412fdeec21b7b5be6b21afad27109c142bf365d7ef98545"} Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.386061 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" event={"ID":"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa","Type":"ContainerStarted","Data":"0cdb504b13689a968367da69f369f317fcd80b9c5a88da08d893464886860b94"} Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.387973 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wg68w" event={"ID":"2c369253-313a-484c-bc8a-dae99abab086","Type":"ContainerStarted","Data":"ce58c2d4371a4d62cd94445845e7476a036f1a8fcda668e0fd8242488a6820fe"} Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.388856 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"550755c5-af9b-446d-a3ad-d5afc6264e89","Type":"ContainerStarted","Data":"b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81"} Jan 31 04:29:51 crc kubenswrapper[4812]: I0131 04:29:51.391265 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" event={"ID":"22b629d2-4035-41be-9ef1-bb74508929eb","Type":"ContainerStarted","Data":"562bd1688e18db44ada69b98eedb855630f528ff7f468adc2a20f7c69e9c1101"} Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.401359 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-bfclc" 
podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.401784 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/community-operator-index:v4.18\\\"\"" pod="openshift-marketplace/community-operators-56x2d" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.403455 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-operator-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-operators-jtdpt" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" Jan 31 04:29:51 crc kubenswrapper[4812]: E0131 04:29:51.412709 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/redhat-marketplace-index:v4.18\\\"\"" pod="openshift-marketplace/redhat-marketplace-sktpt" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" Jan 31 04:29:52 crc kubenswrapper[4812]: I0131 04:29:52.398474 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" event={"ID":"22b629d2-4035-41be-9ef1-bb74508929eb","Type":"ContainerStarted","Data":"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b"} Jan 31 04:29:52 crc kubenswrapper[4812]: I0131 04:29:52.400415 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7","Type":"ContainerStarted","Data":"cec7d26a5465ec774158c5a3aa9c83fa984f38911c01bc18bd9854a3739697af"} Jan 31 04:29:52 crc kubenswrapper[4812]: I0131 04:29:52.402251 4812 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" event={"ID":"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa","Type":"ContainerStarted","Data":"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2"} Jan 31 04:29:53 crc kubenswrapper[4812]: E0131 04:29:53.214700 4812 log.go:32] "PullImage from image service failed" err="rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" image="registry.redhat.io/redhat/certified-operator-index:v4.18" Jan 31 04:29:53 crc kubenswrapper[4812]: E0131 04:29:53.215324 4812 kuberuntime_manager.go:1274] "Unhandled Error" err="init container &Container{Name:extract-content,Image:registry.redhat.io/redhat/certified-operator-index:v4.18,Command:[/utilities/copy-content],Args:[--catalog.from=/configs --catalog.to=/extracted-catalog/catalog --cache.from=/tmp/cache --cache.to=/extracted-catalog/cache],WorkingDir:,Ports:[]ContainerPort{},Env:[]EnvVar{},Resources:ResourceRequirements{Limits:ResourceList{},Requests:ResourceList{},Claims:[]ResourceClaim{},},VolumeMounts:[]VolumeMount{VolumeMount{Name:utilities,ReadOnly:false,MountPath:/utilities,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:catalog-content,ReadOnly:false,MountPath:/extracted-catalog,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},VolumeMount{Name:kube-api-access-2t89x,ReadOnly:true,MountPath:/var/run/secrets/kubernetes.io/serviceaccount,SubPath:,MountPropagation:nil,SubPathExpr:,RecursiveReadOnly:nil,},},LivenessProbe:nil,ReadinessProbe:nil,Lifecycle:nil,TerminationMessagePath:/dev/termination-log,ImagePullPolicy:Always,SecurityContext:&SecurityContext{Capabilities:&Capabilities{Add:[],Drop:[ALL],},Privileged:nil,SELinuxOptions:nil,RunAsUser:*1000170000,RunAsNonRoot:*true,ReadOnlyRootFilesystem:nil,AllowPrivilegeEscalation:*false,RunAsGroup:nil,ProcMount:nil,WindowsOptions:nil,SeccompProfile:nil,AppArmorProfile:nil,},Stdin:false,Stdin
Once:false,TTY:false,EnvFrom:[]EnvFromSource{},TerminationMessagePolicy:FallbackToLogsOnError,VolumeDevices:[]VolumeDevice{},StartupProbe:nil,ResizePolicy:[]ContainerResizePolicy{},RestartPolicy:nil,} start failed in pod certified-operators-sj99w_openshift-marketplace(d9c1a0d3-b881-4382-89c4-905ad455a360): ErrImagePull: rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled" logger="UnhandledError" Jan 31 04:29:53 crc kubenswrapper[4812]: E0131 04:29:53.216946 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ErrImagePull: \"rpc error: code = Canceled desc = copying system image from manifest list: copying config: context canceled\"" pod="openshift-marketplace/certified-operators-sj99w" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.412954 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"550755c5-af9b-446d-a3ad-d5afc6264e89","Type":"ContainerStarted","Data":"693bed728ba21f37217257e0a7b27a629eb1be4e6d2d5395a94bba1c8cb0122b"} Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.420324 4812 generic.go:334] "Generic (PLEG): container finished" podID="600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" containerID="cec7d26a5465ec774158c5a3aa9c83fa984f38911c01bc18bd9854a3739697af" exitCode=0 Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.420426 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7","Type":"ContainerDied","Data":"cec7d26a5465ec774158c5a3aa9c83fa984f38911c01bc18bd9854a3739697af"} Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.425253 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" 
podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerName="controller-manager" containerID="cri-o://6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2" gracePeriod=30 Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.425333 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wg68w" event={"ID":"2c369253-313a-484c-bc8a-dae99abab086","Type":"ContainerStarted","Data":"1f912ed01607d6117e85624bc9adc8828783e28c815309c3bae6a6368e300eff"} Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.425896 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" podUID="22b629d2-4035-41be-9ef1-bb74508929eb" containerName="route-controller-manager" containerID="cri-o://aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b" gracePeriod=30 Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.426172 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:53 crc kubenswrapper[4812]: E0131 04:29:53.430331 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"extract-content\" with ImagePullBackOff: \"Back-off pulling image \\\"registry.redhat.io/redhat/certified-operator-index:v4.18\\\"\"" pod="openshift-marketplace/certified-operators-sj99w" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.442311 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/installer-9-crc" podStartSLOduration=8.441225692 podStartE2EDuration="8.441225692s" podCreationTimestamp="2026-01-31 04:29:45 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:53.436698237 +0000 UTC m=+201.931719922" 
watchObservedRunningTime="2026-01-31 04:29:53.441225692 +0000 UTC m=+201.936247397" Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.447550 4812 patch_prober.go:28] interesting pod/controller-manager-f7d9f84f5-ls8vr container/controller-manager namespace/openshift-controller-manager: Readiness probe status=failure output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54010->10.217.0.54:8443: read: connection reset by peer" start-of-body= Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.447625 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerName="controller-manager" probeResult="failure" output="Get \"https://10.217.0.54:8443/healthz\": read tcp 10.217.0.2:54010->10.217.0.54:8443: read: connection reset by peer" Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.494312 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" podStartSLOduration=36.49429392 podStartE2EDuration="36.49429392s" podCreationTimestamp="2026-01-31 04:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:53.49210162 +0000 UTC m=+201.987123315" watchObservedRunningTime="2026-01-31 04:29:53.49429392 +0000 UTC m=+201.989315595" Jan 31 04:29:53 crc kubenswrapper[4812]: I0131 04:29:53.558412 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" podStartSLOduration=36.558390993 podStartE2EDuration="36.558390993s" podCreationTimestamp="2026-01-31 04:29:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:53.536454946 +0000 UTC m=+202.031476611" 
watchObservedRunningTime="2026-01-31 04:29:53.558390993 +0000 UTC m=+202.053412668" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.356684 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.359569 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.398361 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:29:54 crc kubenswrapper[4812]: E0131 04:29:54.398621 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerName="controller-manager" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.398636 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerName="controller-manager" Jan 31 04:29:54 crc kubenswrapper[4812]: E0131 04:29:54.398647 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="22b629d2-4035-41be-9ef1-bb74508929eb" containerName="route-controller-manager" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.398655 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="22b629d2-4035-41be-9ef1-bb74508929eb" containerName="route-controller-manager" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.398791 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerName="controller-manager" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.398808 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="22b629d2-4035-41be-9ef1-bb74508929eb" containerName="route-controller-manager" Jan 31 04:29:54 crc 
kubenswrapper[4812]: I0131 04:29:54.399333 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.405960 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.433019 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-wg68w" event={"ID":"2c369253-313a-484c-bc8a-dae99abab086","Type":"ContainerStarted","Data":"4a4ded909274b849e689dc288bed31910e87e0bbe64836b6d212e7432a6513b2"} Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.434875 4812 generic.go:334] "Generic (PLEG): container finished" podID="22b629d2-4035-41be-9ef1-bb74508929eb" containerID="aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b" exitCode=0 Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.434946 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" event={"ID":"22b629d2-4035-41be-9ef1-bb74508929eb","Type":"ContainerDied","Data":"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b"} Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.434961 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.434977 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85" event={"ID":"22b629d2-4035-41be-9ef1-bb74508929eb","Type":"ContainerDied","Data":"562bd1688e18db44ada69b98eedb855630f528ff7f468adc2a20f7c69e9c1101"} Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.434996 4812 scope.go:117] "RemoveContainer" containerID="aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.437017 4812 generic.go:334] "Generic (PLEG): container finished" podID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" containerID="6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2" exitCode=0 Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.437081 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" event={"ID":"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa","Type":"ContainerDied","Data":"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2"} Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.437117 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" event={"ID":"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa","Type":"ContainerDied","Data":"0cdb504b13689a968367da69f369f317fcd80b9c5a88da08d893464886860b94"} Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.437164 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.451215 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-wg68w" podStartSLOduration=179.451190016 podStartE2EDuration="2m59.451190016s" podCreationTimestamp="2026-01-31 04:26:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:54.449326883 +0000 UTC m=+202.944348558" watchObservedRunningTime="2026-01-31 04:29:54.451190016 +0000 UTC m=+202.946211711" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457151 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbhm7\" (UniqueName: \"kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7\") pod \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457198 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert\") pod \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457238 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca\") pod \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457265 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert\") pod 
\"22b629d2-4035-41be-9ef1-bb74508929eb\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457296 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config\") pod \"22b629d2-4035-41be-9ef1-bb74508929eb\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457326 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles\") pod \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457358 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-n98nb\" (UniqueName: \"kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb\") pod \"22b629d2-4035-41be-9ef1-bb74508929eb\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457382 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca\") pod \"22b629d2-4035-41be-9ef1-bb74508929eb\" (UID: \"22b629d2-4035-41be-9ef1-bb74508929eb\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457399 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config\") pod \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\" (UID: \"fc61f916-6212-4fb9-9a7b-25dd3dd61dfa\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457766 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457824 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zswsr\" (UniqueName: \"kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457870 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.457894 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.460681 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config" (OuterVolumeSpecName: "config") pod "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" (UID: "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa"). 
InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.461610 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config" (OuterVolumeSpecName: "config") pod "22b629d2-4035-41be-9ef1-bb74508929eb" (UID: "22b629d2-4035-41be-9ef1-bb74508929eb"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.462107 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca" (OuterVolumeSpecName: "client-ca") pod "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" (UID: "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.463569 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" (UID: "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.463873 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca" (OuterVolumeSpecName: "client-ca") pod "22b629d2-4035-41be-9ef1-bb74508929eb" (UID: "22b629d2-4035-41be-9ef1-bb74508929eb"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.466420 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7" (OuterVolumeSpecName: "kube-api-access-lbhm7") pod "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" (UID: "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa"). InnerVolumeSpecName "kube-api-access-lbhm7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.472055 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "22b629d2-4035-41be-9ef1-bb74508929eb" (UID: "22b629d2-4035-41be-9ef1-bb74508929eb"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.478832 4812 scope.go:117] "RemoveContainer" containerID="aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b" Jan 31 04:29:54 crc kubenswrapper[4812]: E0131 04:29:54.479329 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b\": container with ID starting with aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b not found: ID does not exist" containerID="aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.479367 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b"} err="failed to get container status \"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b\": rpc error: code = NotFound desc = could not find container 
\"aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b\": container with ID starting with aedff67c43697d8acf567e98128d972c860dccebd93ebb3104b1ae6830562f9b not found: ID does not exist" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.479507 4812 scope.go:117] "RemoveContainer" containerID="6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.480474 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb" (OuterVolumeSpecName: "kube-api-access-n98nb") pod "22b629d2-4035-41be-9ef1-bb74508929eb" (UID: "22b629d2-4035-41be-9ef1-bb74508929eb"). InnerVolumeSpecName "kube-api-access-n98nb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.481953 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" (UID: "fc61f916-6212-4fb9-9a7b-25dd3dd61dfa"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.508561 4812 scope.go:117] "RemoveContainer" containerID="6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2" Jan 31 04:29:54 crc kubenswrapper[4812]: E0131 04:29:54.509469 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2\": container with ID starting with 6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2 not found: ID does not exist" containerID="6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.509514 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2"} err="failed to get container status \"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2\": rpc error: code = NotFound desc = could not find container \"6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2\": container with ID starting with 6477cf0f0865ea30a6217268b5907edad7121139483cc6029b1ea443fdfa92e2 not found: ID does not exist" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.559596 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-zswsr\" (UniqueName: \"kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.559673 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca\") pod 
\"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.559704 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.559755 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.559810 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560133 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560151 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-n98nb\" (UniqueName: \"kubernetes.io/projected/22b629d2-4035-41be-9ef1-bb74508929eb-kube-api-access-n98nb\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560161 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560170 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/22b629d2-4035-41be-9ef1-bb74508929eb-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560183 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbhm7\" (UniqueName: \"kubernetes.io/projected/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-kube-api-access-lbhm7\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560194 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560205 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.560220 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/22b629d2-4035-41be-9ef1-bb74508929eb-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.562272 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.562967 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" 
(UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.566317 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.579873 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-zswsr\" (UniqueName: \"kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr\") pod \"route-controller-manager-55c4986d76-2wbsp\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.629674 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.660975 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access\") pod \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.661062 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir\") pod \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\" (UID: \"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7\") " Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.661147 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" (UID: "600ff6fe-2bd7-485b-851a-5b3f9b1c37c7"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.661327 4812 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.671564 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" (UID: "600ff6fe-2bd7-485b-851a-5b3f9b1c37c7"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.717514 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.762401 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/600ff6fe-2bd7-485b-851a-5b3f9b1c37c7-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.769769 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.775396 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-f7d9f84f5-ls8vr"] Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.781085 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.781966 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-7fb457f777-k4d85"] Jan 31 04:29:54 crc kubenswrapper[4812]: I0131 04:29:54.908111 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.444463 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" event={"ID":"bf71f913-19f1-4c11-b85c-4a4471dff4df","Type":"ContainerStarted","Data":"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7"} Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.444919 4812 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" event={"ID":"bf71f913-19f1-4c11-b85c-4a4471dff4df","Type":"ContainerStarted","Data":"7ec7452591619ae796caa1b45eacab3e7417dc31e391c9fc9ca8fe1f83da0f41"} Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.448886 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.465086 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/revision-pruner-9-crc" event={"ID":"600ff6fe-2bd7-485b-851a-5b3f9b1c37c7","Type":"ContainerDied","Data":"45bfd5a2cf45e0cff412fdeec21b7b5be6b21afad27109c142bf365d7ef98545"} Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.465150 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45bfd5a2cf45e0cff412fdeec21b7b5be6b21afad27109c142bf365d7ef98545" Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.466527 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/revision-pruner-9-crc" Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.474527 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" podStartSLOduration=18.474507317 podStartE2EDuration="18.474507317s" podCreationTimestamp="2026-01-31 04:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:55.469900789 +0000 UTC m=+203.964922464" watchObservedRunningTime="2026-01-31 04:29:55.474507317 +0000 UTC m=+203.969528992" Jan 31 04:29:55 crc kubenswrapper[4812]: I0131 04:29:55.548918 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.348563 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="22b629d2-4035-41be-9ef1-bb74508929eb" path="/var/lib/kubelet/pods/22b629d2-4035-41be-9ef1-bb74508929eb/volumes" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.355213 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fc61f916-6212-4fb9-9a7b-25dd3dd61dfa" path="/var/lib/kubelet/pods/fc61f916-6212-4fb9-9a7b-25dd3dd61dfa/volumes" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.957259 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:29:56 crc kubenswrapper[4812]: E0131 04:29:56.957603 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" containerName="pruner" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.957624 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" containerName="pruner" Jan 31 04:29:56 crc 
kubenswrapper[4812]: I0131 04:29:56.957824 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="600ff6fe-2bd7-485b-851a-5b3f9b1c37c7" containerName="pruner" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.958393 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.965893 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.970048 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.970259 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.970291 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.970948 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.971164 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.974026 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.978238 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.989713 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.989756 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.989778 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-svlp4\" (UniqueName: \"kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.989800 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:56 crc kubenswrapper[4812]: I0131 04:29:56.989813 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: 
\"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.090405 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.090594 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.090684 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.090712 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.090734 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-svlp4\" (UniqueName: 
\"kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.092551 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.092969 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.094784 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.099632 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.108217 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"kube-api-access-svlp4\" (UniqueName: \"kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4\") pod \"controller-manager-ffc7946bc-5w92b\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.277185 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:57 crc kubenswrapper[4812]: I0131 04:29:57.543341 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:29:57 crc kubenswrapper[4812]: W0131 04:29:57.557012 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4254ca3f_b888_4f7b_b5e3_c006979f0a8d.slice/crio-8ac0c63380512fffb2fd5b8f7b3a3931c86870df812ebbe70eb1338855fa3419 WatchSource:0}: Error finding container 8ac0c63380512fffb2fd5b8f7b3a3931c86870df812ebbe70eb1338855fa3419: Status 404 returned error can't find the container with id 8ac0c63380512fffb2fd5b8f7b3a3931c86870df812ebbe70eb1338855fa3419 Jan 31 04:29:58 crc kubenswrapper[4812]: I0131 04:29:58.485889 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" event={"ID":"4254ca3f-b888-4f7b-b5e3-c006979f0a8d","Type":"ContainerStarted","Data":"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80"} Jan 31 04:29:58 crc kubenswrapper[4812]: I0131 04:29:58.487915 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" event={"ID":"4254ca3f-b888-4f7b-b5e3-c006979f0a8d","Type":"ContainerStarted","Data":"8ac0c63380512fffb2fd5b8f7b3a3931c86870df812ebbe70eb1338855fa3419"} Jan 31 04:29:58 crc kubenswrapper[4812]: I0131 04:29:58.487978 4812 kubelet.go:2542] 
"SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:58 crc kubenswrapper[4812]: I0131 04:29:58.504154 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:29:58 crc kubenswrapper[4812]: I0131 04:29:58.525658 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" podStartSLOduration=21.525614612 podStartE2EDuration="21.525614612s" podCreationTimestamp="2026-01-31 04:29:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:29:58.521138528 +0000 UTC m=+207.016160223" watchObservedRunningTime="2026-01-31 04:29:58.525614612 +0000 UTC m=+207.020636277" Jan 31 04:29:59 crc kubenswrapper[4812]: I0131 04:29:59.493486 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerStarted","Data":"47e96244ef77dbeb464d7222f0578420d8e6732378ff529062ef6fa301c54ac8"} Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.151681 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6"] Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.153491 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.155433 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.155882 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.171995 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6"] Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.240797 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.240927 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fkwss\" (UniqueName: \"kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.241091 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.342118 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fkwss\" (UniqueName: \"kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.342526 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.342734 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.345047 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.356726 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.374692 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fkwss\" (UniqueName: \"kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss\") pod \"collect-profiles-29497230-d24b6\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.473508 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.507881 4812 generic.go:334] "Generic (PLEG): container finished" podID="e97d5bed-3c11-40af-869e-53c260650edb" containerID="47e96244ef77dbeb464d7222f0578420d8e6732378ff529062ef6fa301c54ac8" exitCode=0 Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.507995 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerDied","Data":"47e96244ef77dbeb464d7222f0578420d8e6732378ff529062ef6fa301c54ac8"} Jan 31 04:30:00 crc kubenswrapper[4812]: I0131 04:30:00.879627 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6"] Jan 31 04:30:01 crc kubenswrapper[4812]: I0131 04:30:01.516451 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerStarted","Data":"51bde5c7b35f26bbfacc435433b9b7f8c47a4910ac941a9b353abbf05dcd5084"} Jan 31 04:30:01 crc 
kubenswrapper[4812]: I0131 04:30:01.519168 4812 generic.go:334] "Generic (PLEG): container finished" podID="972ebef5-321d-489c-a63c-c32ac288fcac" containerID="c5c4a959a92ba2948d8c791f0e4a38aa75f445c59a07e30d8befe81f4a544887" exitCode=0 Jan 31 04:30:01 crc kubenswrapper[4812]: I0131 04:30:01.519233 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" event={"ID":"972ebef5-321d-489c-a63c-c32ac288fcac","Type":"ContainerDied","Data":"c5c4a959a92ba2948d8c791f0e4a38aa75f445c59a07e30d8befe81f4a544887"} Jan 31 04:30:01 crc kubenswrapper[4812]: I0131 04:30:01.519272 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" event={"ID":"972ebef5-321d-489c-a63c-c32ac288fcac","Type":"ContainerStarted","Data":"b551a76581381ada00049e4b664a525e62a2b36440428f5f5db8ad9c217361e2"} Jan 31 04:30:01 crc kubenswrapper[4812]: I0131 04:30:01.537497 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-2z5tv" podStartSLOduration=3.481850585 podStartE2EDuration="1m2.53747543s" podCreationTimestamp="2026-01-31 04:28:59 +0000 UTC" firstStartedPulling="2026-01-31 04:29:01.881364628 +0000 UTC m=+150.376386293" lastFinishedPulling="2026-01-31 04:30:00.936989433 +0000 UTC m=+209.432011138" observedRunningTime="2026-01-31 04:30:01.53241607 +0000 UTC m=+210.027437775" watchObservedRunningTime="2026-01-31 04:30:01.53747543 +0000 UTC m=+210.032497135" Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.070265 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.832779 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.977879 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume\") pod \"972ebef5-321d-489c-a63c-c32ac288fcac\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.978181 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume\") pod \"972ebef5-321d-489c-a63c-c32ac288fcac\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.978301 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fkwss\" (UniqueName: \"kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss\") pod \"972ebef5-321d-489c-a63c-c32ac288fcac\" (UID: \"972ebef5-321d-489c-a63c-c32ac288fcac\") " Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.978569 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume" (OuterVolumeSpecName: "config-volume") pod "972ebef5-321d-489c-a63c-c32ac288fcac" (UID: "972ebef5-321d-489c-a63c-c32ac288fcac"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.978689 4812 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/972ebef5-321d-489c-a63c-c32ac288fcac-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.983577 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "972ebef5-321d-489c-a63c-c32ac288fcac" (UID: "972ebef5-321d-489c-a63c-c32ac288fcac"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:02 crc kubenswrapper[4812]: I0131 04:30:02.987390 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss" (OuterVolumeSpecName: "kube-api-access-fkwss") pod "972ebef5-321d-489c-a63c-c32ac288fcac" (UID: "972ebef5-321d-489c-a63c-c32ac288fcac"). InnerVolumeSpecName "kube-api-access-fkwss". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:03 crc kubenswrapper[4812]: I0131 04:30:03.080451 4812 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/972ebef5-321d-489c-a63c-c32ac288fcac-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:03 crc kubenswrapper[4812]: I0131 04:30:03.080507 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fkwss\" (UniqueName: \"kubernetes.io/projected/972ebef5-321d-489c-a63c-c32ac288fcac-kube-api-access-fkwss\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:03 crc kubenswrapper[4812]: I0131 04:30:03.541943 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" event={"ID":"972ebef5-321d-489c-a63c-c32ac288fcac","Type":"ContainerDied","Data":"b551a76581381ada00049e4b664a525e62a2b36440428f5f5db8ad9c217361e2"} Jan 31 04:30:03 crc kubenswrapper[4812]: I0131 04:30:03.542294 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b551a76581381ada00049e4b664a525e62a2b36440428f5f5db8ad9c217361e2" Jan 31 04:30:03 crc kubenswrapper[4812]: I0131 04:30:03.542066 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497230-d24b6" Jan 31 04:30:04 crc kubenswrapper[4812]: I0131 04:30:04.548245 4812 generic.go:334] "Generic (PLEG): container finished" podID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerID="b2053bfd5d695201d9bffcdc4c6f27913c5f37b27304a9fefaa61bdbfc9e299f" exitCode=0 Jan 31 04:30:04 crc kubenswrapper[4812]: I0131 04:30:04.548520 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerDied","Data":"b2053bfd5d695201d9bffcdc4c6f27913c5f37b27304a9fefaa61bdbfc9e299f"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.568749 4812 generic.go:334] "Generic (PLEG): container finished" podID="a16a82f9-4289-4749-bc62-df59dacefac1" containerID="8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.569392 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerDied","Data":"8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.573166 4812 generic.go:334] "Generic (PLEG): container finished" podID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerID="4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.573247 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerDied","Data":"4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.577631 4812 generic.go:334] "Generic (PLEG): container finished" podID="d6e79cce-8b4e-491b-a976-a3649e3566cd" 
containerID="3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.577690 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerDied","Data":"3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.581203 4812 generic.go:334] "Generic (PLEG): container finished" podID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerID="0719fc29a76ad28fa8ea510f5749e3ecaf212ffb2f213287312eb8f45b5eb3f5" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.581270 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerDied","Data":"0719fc29a76ad28fa8ea510f5749e3ecaf212ffb2f213287312eb8f45b5eb3f5"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.587867 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerStarted","Data":"c6883ef1a1eeee16e4ad80d31f5bfc49ef641b816b999dfd58cbdee3de121392"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.589589 4812 generic.go:334] "Generic (PLEG): container finished" podID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerID="e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.589632 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerDied","Data":"e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.593676 4812 generic.go:334] "Generic (PLEG): container 
finished" podID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerID="78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118" exitCode=0 Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.593736 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerDied","Data":"78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118"} Jan 31 04:30:08 crc kubenswrapper[4812]: I0131 04:30:08.664722 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-94kb7" podStartSLOduration=4.243252036 podStartE2EDuration="1m9.664702877s" podCreationTimestamp="2026-01-31 04:28:59 +0000 UTC" firstStartedPulling="2026-01-31 04:29:01.876937358 +0000 UTC m=+150.371959023" lastFinishedPulling="2026-01-31 04:30:07.298388199 +0000 UTC m=+215.793409864" observedRunningTime="2026-01-31 04:30:08.6586514 +0000 UTC m=+217.153673095" watchObservedRunningTime="2026-01-31 04:30:08.664702877 +0000 UTC m=+217.159724542" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.612886 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerStarted","Data":"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c"} Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.616695 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerStarted","Data":"4d3a725e3d8169eff4a889017e23a84eae2b00bd4f2c6e88959e05059c69d1f3"} Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.620304 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" 
event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerStarted","Data":"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5"} Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.622299 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerStarted","Data":"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673"} Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.624062 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerStarted","Data":"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e"} Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.636884 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-bfclc" podStartSLOduration=2.139231034 podStartE2EDuration="1m8.636865725s" podCreationTimestamp="2026-01-31 04:29:01 +0000 UTC" firstStartedPulling="2026-01-31 04:29:02.940378894 +0000 UTC m=+151.435400559" lastFinishedPulling="2026-01-31 04:30:09.438013585 +0000 UTC m=+217.933035250" observedRunningTime="2026-01-31 04:30:09.635171647 +0000 UTC m=+218.130193312" watchObservedRunningTime="2026-01-31 04:30:09.636865725 +0000 UTC m=+218.131887390" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.664210 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-56x2d" podStartSLOduration=3.274682002 podStartE2EDuration="1m10.66419556s" podCreationTimestamp="2026-01-31 04:28:59 +0000 UTC" firstStartedPulling="2026-01-31 04:29:01.866207915 +0000 UTC m=+150.361229580" lastFinishedPulling="2026-01-31 04:30:09.255721473 +0000 UTC m=+217.750743138" observedRunningTime="2026-01-31 04:30:09.661118195 +0000 UTC m=+218.156139860" 
watchObservedRunningTime="2026-01-31 04:30:09.66419556 +0000 UTC m=+218.159217225" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.691243 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-jtdpt" podStartSLOduration=3.336489289 podStartE2EDuration="1m7.691228827s" podCreationTimestamp="2026-01-31 04:29:02 +0000 UTC" firstStartedPulling="2026-01-31 04:29:05.001753442 +0000 UTC m=+153.496775107" lastFinishedPulling="2026-01-31 04:30:09.35649299 +0000 UTC m=+217.851514645" observedRunningTime="2026-01-31 04:30:09.690318323 +0000 UTC m=+218.185339988" watchObservedRunningTime="2026-01-31 04:30:09.691228827 +0000 UTC m=+218.186250492" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.710876 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-sj99w" podStartSLOduration=2.539355346 podStartE2EDuration="1m10.71085907s" podCreationTimestamp="2026-01-31 04:28:59 +0000 UTC" firstStartedPulling="2026-01-31 04:29:00.859759931 +0000 UTC m=+149.354781586" lastFinishedPulling="2026-01-31 04:30:09.031263645 +0000 UTC m=+217.526285310" observedRunningTime="2026-01-31 04:30:09.707767435 +0000 UTC m=+218.202789100" watchObservedRunningTime="2026-01-31 04:30:09.71085907 +0000 UTC m=+218.205880735" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.727993 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-sktpt" podStartSLOduration=2.680265938 podStartE2EDuration="1m8.727975094s" podCreationTimestamp="2026-01-31 04:29:01 +0000 UTC" firstStartedPulling="2026-01-31 04:29:02.939992034 +0000 UTC m=+151.435013699" lastFinishedPulling="2026-01-31 04:30:08.98770119 +0000 UTC m=+217.482722855" observedRunningTime="2026-01-31 04:30:09.727567352 +0000 UTC m=+218.222589017" watchObservedRunningTime="2026-01-31 04:30:09.727975094 +0000 UTC m=+218.222996759" Jan 31 04:30:09 crc 
kubenswrapper[4812]: I0131 04:30:09.739065 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.739129 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.910896 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:30:09 crc kubenswrapper[4812]: I0131 04:30:09.911373 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.136412 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.136697 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.182516 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.378223 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.378569 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.420119 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.633047 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerStarted","Data":"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e"} Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.654168 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-487ln" podStartSLOduration=2.968201324 podStartE2EDuration="1m8.654150899s" podCreationTimestamp="2026-01-31 04:29:02 +0000 UTC" firstStartedPulling="2026-01-31 04:29:03.951760162 +0000 UTC m=+152.446781827" lastFinishedPulling="2026-01-31 04:30:09.637709737 +0000 UTC m=+218.132731402" observedRunningTime="2026-01-31 04:30:10.653078319 +0000 UTC m=+219.148099984" watchObservedRunningTime="2026-01-31 04:30:10.654150899 +0000 UTC m=+219.149172564" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.673713 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.912690 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/certified-operators-sj99w" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="registry-server" probeResult="failure" output=< Jan 31 04:30:10 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:30:10 crc kubenswrapper[4812]: > Jan 31 04:30:10 crc kubenswrapper[4812]: I0131 04:30:10.950040 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-56x2d" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="registry-server" probeResult="failure" output=< Jan 31 04:30:10 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:30:10 crc kubenswrapper[4812]: > Jan 31 04:30:11 crc kubenswrapper[4812]: I0131 04:30:11.908859 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:30:11 crc kubenswrapper[4812]: I0131 04:30:11.908927 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:30:11 crc kubenswrapper[4812]: I0131 04:30:11.952580 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:30:12 crc kubenswrapper[4812]: I0131 04:30:12.325868 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:30:12 crc kubenswrapper[4812]: I0131 04:30:12.352411 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:12 crc kubenswrapper[4812]: I0131 04:30:12.352475 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:12 crc kubenswrapper[4812]: I0131 04:30:12.395856 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:12 crc kubenswrapper[4812]: I0131 04:30:12.642949 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-2z5tv" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="registry-server" containerID="cri-o://51bde5c7b35f26bbfacc435433b9b7f8c47a4910ac941a9b353abbf05dcd5084" gracePeriod=2 Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.048638 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.049053 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.327262 4812 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.327316 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.651422 4812 generic.go:334] "Generic (PLEG): container finished" podID="e97d5bed-3c11-40af-869e-53c260650edb" containerID="51bde5c7b35f26bbfacc435433b9b7f8c47a4910ac941a9b353abbf05dcd5084" exitCode=0 Jan 31 04:30:13 crc kubenswrapper[4812]: I0131 04:30:13.651474 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerDied","Data":"51bde5c7b35f26bbfacc435433b9b7f8c47a4910ac941a9b353abbf05dcd5084"} Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.127776 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-487ln" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="registry-server" probeResult="failure" output=< Jan 31 04:30:14 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:30:14 crc kubenswrapper[4812]: > Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.223604 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.338203 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.338252 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.338288 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.338688 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.338736 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de" gracePeriod=600 Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.380539 4812 prober.go:107] "Probe failed" probeType="Startup" 
pod="openshift-marketplace/redhat-operators-jtdpt" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="registry-server" probeResult="failure" output=< Jan 31 04:30:14 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:30:14 crc kubenswrapper[4812]: > Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.402566 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities\") pod \"e97d5bed-3c11-40af-869e-53c260650edb\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.402666 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content\") pod \"e97d5bed-3c11-40af-869e-53c260650edb\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.402731 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h8dn5\" (UniqueName: \"kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5\") pod \"e97d5bed-3c11-40af-869e-53c260650edb\" (UID: \"e97d5bed-3c11-40af-869e-53c260650edb\") " Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.403877 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities" (OuterVolumeSpecName: "utilities") pod "e97d5bed-3c11-40af-869e-53c260650edb" (UID: "e97d5bed-3c11-40af-869e-53c260650edb"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.419813 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5" (OuterVolumeSpecName: "kube-api-access-h8dn5") pod "e97d5bed-3c11-40af-869e-53c260650edb" (UID: "e97d5bed-3c11-40af-869e-53c260650edb"). InnerVolumeSpecName "kube-api-access-h8dn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.456277 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e97d5bed-3c11-40af-869e-53c260650edb" (UID: "e97d5bed-3c11-40af-869e-53c260650edb"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.503786 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.503821 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e97d5bed-3c11-40af-869e-53c260650edb-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.503832 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h8dn5\" (UniqueName: \"kubernetes.io/projected/e97d5bed-3c11-40af-869e-53c260650edb-kube-api-access-h8dn5\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.659911 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-2z5tv" 
event={"ID":"e97d5bed-3c11-40af-869e-53c260650edb","Type":"ContainerDied","Data":"a8a6d702db9cf37fd534778f4cb6a300c57b5102151186b3e73f9cba2b4731c8"} Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.659965 4812 scope.go:117] "RemoveContainer" containerID="51bde5c7b35f26bbfacc435433b9b7f8c47a4910ac941a9b353abbf05dcd5084" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.660006 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-2z5tv" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.679061 4812 scope.go:117] "RemoveContainer" containerID="47e96244ef77dbeb464d7222f0578420d8e6732378ff529062ef6fa301c54ac8" Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.694394 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.699002 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-2z5tv"] Jan 31 04:30:14 crc kubenswrapper[4812]: I0131 04:30:14.708149 4812 scope.go:117] "RemoveContainer" containerID="f2b32469863c3a03f5bb64d42c151a2c1fe58560d028c1e25b3fb4bf79055b6f" Jan 31 04:30:15 crc kubenswrapper[4812]: I0131 04:30:15.667898 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de" exitCode=0 Jan 31 04:30:15 crc kubenswrapper[4812]: I0131 04:30:15.667995 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de"} Jan 31 04:30:16 crc kubenswrapper[4812]: I0131 04:30:16.347108 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e97d5bed-3c11-40af-869e-53c260650edb" 
path="/var/lib/kubelet/pods/e97d5bed-3c11-40af-869e-53c260650edb/volumes" Jan 31 04:30:16 crc kubenswrapper[4812]: I0131 04:30:16.687290 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521"} Jan 31 04:30:17 crc kubenswrapper[4812]: I0131 04:30:17.393770 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:30:17 crc kubenswrapper[4812]: I0131 04:30:17.394106 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" podUID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" containerName="controller-manager" containerID="cri-o://02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80" gracePeriod=30 Jan 31 04:30:17 crc kubenswrapper[4812]: I0131 04:30:17.476585 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:30:17 crc kubenswrapper[4812]: I0131 04:30:17.476819 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" podUID="bf71f913-19f1-4c11-b85c-4a4471dff4df" containerName="route-controller-manager" containerID="cri-o://eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7" gracePeriod=30 Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.562590 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.569649 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607233 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.607701 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="registry-server" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607745 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="registry-server" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.607778 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="972ebef5-321d-489c-a63c-c32ac288fcac" containerName="collect-profiles" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607795 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="972ebef5-321d-489c-a63c-c32ac288fcac" containerName="collect-profiles" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.607822 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" containerName="controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607886 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" containerName="controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.607914 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="extract-utilities" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607932 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="extract-utilities" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.607960 4812 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="bf71f913-19f1-4c11-b85c-4a4471dff4df" containerName="route-controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.607997 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf71f913-19f1-4c11-b85c-4a4471dff4df" containerName="route-controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.608048 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="extract-content" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.608066 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="extract-content" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.608283 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" containerName="controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.608314 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e97d5bed-3c11-40af-869e-53c260650edb" containerName="registry-server" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.608372 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="972ebef5-321d-489c-a63c-c32ac288fcac" containerName="collect-profiles" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.608412 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf71f913-19f1-4c11-b85c-4a4471dff4df" containerName="route-controller-manager" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.609171 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.617919 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.699791 4812 generic.go:334] "Generic (PLEG): container finished" podID="bf71f913-19f1-4c11-b85c-4a4471dff4df" containerID="eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7" exitCode=0 Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.699881 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" event={"ID":"bf71f913-19f1-4c11-b85c-4a4471dff4df","Type":"ContainerDied","Data":"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7"} Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.699909 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" event={"ID":"bf71f913-19f1-4c11-b85c-4a4471dff4df","Type":"ContainerDied","Data":"7ec7452591619ae796caa1b45eacab3e7417dc31e391c9fc9ca8fe1f83da0f41"} Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.699926 4812 scope.go:117] "RemoveContainer" containerID="eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.700012 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.702152 4812 generic.go:334] "Generic (PLEG): container finished" podID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" containerID="02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80" exitCode=0 Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.702171 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" event={"ID":"4254ca3f-b888-4f7b-b5e3-c006979f0a8d","Type":"ContainerDied","Data":"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80"} Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.702184 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" event={"ID":"4254ca3f-b888-4f7b-b5e3-c006979f0a8d","Type":"ContainerDied","Data":"8ac0c63380512fffb2fd5b8f7b3a3931c86870df812ebbe70eb1338855fa3419"} Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.702217 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-ffc7946bc-5w92b" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.717151 4812 scope.go:117] "RemoveContainer" containerID="eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.717431 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7\": container with ID starting with eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7 not found: ID does not exist" containerID="eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.717464 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7"} err="failed to get container status \"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7\": rpc error: code = NotFound desc = could not find container \"eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7\": container with ID starting with eae952baed21ed40f80a9a7094f43cbc40ab360e45b6bad7087dee384b8738f7 not found: ID does not exist" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.717485 4812 scope.go:117] "RemoveContainer" containerID="02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.729853 4812 scope.go:117] "RemoveContainer" containerID="02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80" Jan 31 04:30:18 crc kubenswrapper[4812]: E0131 04:30:18.730125 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80\": container with ID starting with 
02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80 not found: ID does not exist" containerID="02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.730175 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80"} err="failed to get container status \"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80\": rpc error: code = NotFound desc = could not find container \"02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80\": container with ID starting with 02776b149d5557f983d2b2a9ccef937d66463d9cc24c53985c6b7bc6308a1a80 not found: ID does not exist" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.763946 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-svlp4\" (UniqueName: \"kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4\") pod \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.763985 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zswsr\" (UniqueName: \"kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr\") pod \"bf71f913-19f1-4c11-b85c-4a4471dff4df\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764004 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles\") pod \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764052 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for 
volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca\") pod \"bf71f913-19f1-4c11-b85c-4a4471dff4df\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764072 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config\") pod \"bf71f913-19f1-4c11-b85c-4a4471dff4df\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764093 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca\") pod \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764124 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert\") pod \"bf71f913-19f1-4c11-b85c-4a4471dff4df\" (UID: \"bf71f913-19f1-4c11-b85c-4a4471dff4df\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764141 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert\") pod \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764200 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config\") pod \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\" (UID: \"4254ca3f-b888-4f7b-b5e3-c006979f0a8d\") " Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764370 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6ctw\" (UniqueName: \"kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764427 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764467 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.764485 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.765957 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles" (OuterVolumeSpecName: 
"proxy-ca-bundles") pod "4254ca3f-b888-4f7b-b5e3-c006979f0a8d" (UID: "4254ca3f-b888-4f7b-b5e3-c006979f0a8d"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.767088 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca" (OuterVolumeSpecName: "client-ca") pod "bf71f913-19f1-4c11-b85c-4a4471dff4df" (UID: "bf71f913-19f1-4c11-b85c-4a4471dff4df"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.767290 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca" (OuterVolumeSpecName: "client-ca") pod "4254ca3f-b888-4f7b-b5e3-c006979f0a8d" (UID: "4254ca3f-b888-4f7b-b5e3-c006979f0a8d"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.767237 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config" (OuterVolumeSpecName: "config") pod "bf71f913-19f1-4c11-b85c-4a4471dff4df" (UID: "bf71f913-19f1-4c11-b85c-4a4471dff4df"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.767798 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config" (OuterVolumeSpecName: "config") pod "4254ca3f-b888-4f7b-b5e3-c006979f0a8d" (UID: "4254ca3f-b888-4f7b-b5e3-c006979f0a8d"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.771184 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "4254ca3f-b888-4f7b-b5e3-c006979f0a8d" (UID: "4254ca3f-b888-4f7b-b5e3-c006979f0a8d"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.771339 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "bf71f913-19f1-4c11-b85c-4a4471dff4df" (UID: "bf71f913-19f1-4c11-b85c-4a4471dff4df"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.777197 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr" (OuterVolumeSpecName: "kube-api-access-zswsr") pod "bf71f913-19f1-4c11-b85c-4a4471dff4df" (UID: "bf71f913-19f1-4c11-b85c-4a4471dff4df"). InnerVolumeSpecName "kube-api-access-zswsr". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.778033 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4" (OuterVolumeSpecName: "kube-api-access-svlp4") pod "4254ca3f-b888-4f7b-b5e3-c006979f0a8d" (UID: "4254ca3f-b888-4f7b-b5e3-c006979f0a8d"). InnerVolumeSpecName "kube-api-access-svlp4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866032 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d6ctw\" (UniqueName: \"kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866120 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866209 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866246 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866364 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866388 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-svlp4\" (UniqueName: \"kubernetes.io/projected/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-kube-api-access-svlp4\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866411 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-zswsr\" (UniqueName: \"kubernetes.io/projected/bf71f913-19f1-4c11-b85c-4a4471dff4df-kube-api-access-zswsr\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866429 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866448 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866468 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bf71f913-19f1-4c11-b85c-4a4471dff4df-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866485 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.866503 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bf71f913-19f1-4c11-b85c-4a4471dff4df-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc 
kubenswrapper[4812]: I0131 04:30:18.866520 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/4254ca3f-b888-4f7b-b5e3-c006979f0a8d-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.868443 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.871042 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.872144 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 04:30:18.883425 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6ctw\" (UniqueName: \"kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw\") pod \"route-controller-manager-c59479947-k6l6x\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:18 crc kubenswrapper[4812]: I0131 
04:30:18.934729 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.047423 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.049536 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-ffc7946bc-5w92b"] Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.067371 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.070986 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-55c4986d76-2wbsp"] Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.438769 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:30:19 crc kubenswrapper[4812]: W0131 04:30:19.445968 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf87f95aa_2fc9_4c0d_982c_e37190ae86cc.slice/crio-79147bd82f97aa612f434c5631dbe2b8c4e8662f15e8a60a965f58ad7c311787 WatchSource:0}: Error finding container 79147bd82f97aa612f434c5631dbe2b8c4e8662f15e8a60a965f58ad7c311787: Status 404 returned error can't find the container with id 79147bd82f97aa612f434c5631dbe2b8c4e8662f15e8a60a965f58ad7c311787 Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.714683 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" 
event={"ID":"f87f95aa-2fc9-4c0d-982c-e37190ae86cc","Type":"ContainerStarted","Data":"79147bd82f97aa612f434c5631dbe2b8c4e8662f15e8a60a965f58ad7c311787"} Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.810316 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.896870 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:30:19 crc kubenswrapper[4812]: I0131 04:30:19.970820 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.011903 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.214312 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.352899 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4254ca3f-b888-4f7b-b5e3-c006979f0a8d" path="/var/lib/kubelet/pods/4254ca3f-b888-4f7b-b5e3-c006979f0a8d/volumes" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.354966 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf71f913-19f1-4c11-b85c-4a4471dff4df" path="/var/lib/kubelet/pods/bf71f913-19f1-4c11-b85c-4a4471dff4df/volumes" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.731368 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" event={"ID":"f87f95aa-2fc9-4c0d-982c-e37190ae86cc","Type":"ContainerStarted","Data":"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5"} Jan 31 04:30:20 crc kubenswrapper[4812]: 
I0131 04:30:20.769903 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" podStartSLOduration=3.769872549 podStartE2EDuration="3.769872549s" podCreationTimestamp="2026-01-31 04:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:30:20.763602385 +0000 UTC m=+229.258624080" watchObservedRunningTime="2026-01-31 04:30:20.769872549 +0000 UTC m=+229.264894244" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.965372 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.966719 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.971043 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.974742 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.975053 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.975716 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.976021 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.977743 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.984566 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:30:20 crc kubenswrapper[4812]: I0131 04:30:20.990828 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.092978 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.093063 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.093123 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.093280 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rnj72\" (UniqueName: 
\"kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.093330 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.194979 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rnj72\" (UniqueName: \"kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.195362 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.195655 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 
04:30:21.195982 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.196223 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.197053 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.197158 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.198088 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" 
Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.204744 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.215173 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rnj72\" (UniqueName: \"kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72\") pod \"controller-manager-69cf84b9dc-tkf9z\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.292794 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.576259 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:30:21 crc kubenswrapper[4812]: W0131 04:30:21.581882 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b59154e_f9e6_475d_9fc7_77af2533a402.slice/crio-62b49f3ca5b10fa341cee379cbc55cd6d1dc4cb2e1ff4f290b08681276196184 WatchSource:0}: Error finding container 62b49f3ca5b10fa341cee379cbc55cd6d1dc4cb2e1ff4f290b08681276196184: Status 404 returned error can't find the container with id 62b49f3ca5b10fa341cee379cbc55cd6d1dc4cb2e1ff4f290b08681276196184 Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.740120 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" 
event={"ID":"9b59154e-f9e6-475d-9fc7-77af2533a402","Type":"ContainerStarted","Data":"62b49f3ca5b10fa341cee379cbc55cd6d1dc4cb2e1ff4f290b08681276196184"} Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.740426 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.748905 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:30:21 crc kubenswrapper[4812]: I0131 04:30:21.967035 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:30:22 crc kubenswrapper[4812]: I0131 04:30:22.437790 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:22 crc kubenswrapper[4812]: I0131 04:30:22.453536 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 04:30:22 crc kubenswrapper[4812]: I0131 04:30:22.456419 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-94kb7" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="registry-server" containerID="cri-o://c6883ef1a1eeee16e4ad80d31f5bfc49ef641b816b999dfd58cbdee3de121392" gracePeriod=2 Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.115408 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.185951 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.401257 4812 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.469469 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.761344 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" event={"ID":"9b59154e-f9e6-475d-9fc7-77af2533a402","Type":"ContainerStarted","Data":"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55"} Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.761689 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.765365 4812 generic.go:334] "Generic (PLEG): container finished" podID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerID="c6883ef1a1eeee16e4ad80d31f5bfc49ef641b816b999dfd58cbdee3de121392" exitCode=0 Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.765505 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerDied","Data":"c6883ef1a1eeee16e4ad80d31f5bfc49ef641b816b999dfd58cbdee3de121392"} Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.768989 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:30:23 crc kubenswrapper[4812]: I0131 04:30:23.791630 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" podStartSLOduration=6.79159457 podStartE2EDuration="6.79159457s" podCreationTimestamp="2026-01-31 04:30:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:30:23.786061727 +0000 UTC m=+232.281083402" watchObservedRunningTime="2026-01-31 04:30:23.79159457 +0000 UTC m=+232.286616275" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.293561 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.444241 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities\") pod \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.444313 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8ffw8\" (UniqueName: \"kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8\") pod \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.444646 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content\") pod \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\" (UID: \"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441\") " Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.446668 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities" (OuterVolumeSpecName: "utilities") pod "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" (UID: "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.456112 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8" (OuterVolumeSpecName: "kube-api-access-8ffw8") pod "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" (UID: "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441"). InnerVolumeSpecName "kube-api-access-8ffw8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.504562 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" (UID: "7cfc040d-5e3a-4ee4-a72d-c67c8d51d441"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.546660 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.546716 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.546737 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8ffw8\" (UniqueName: \"kubernetes.io/projected/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441-kube-api-access-8ffw8\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.776243 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-94kb7" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.776308 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-94kb7" event={"ID":"7cfc040d-5e3a-4ee4-a72d-c67c8d51d441","Type":"ContainerDied","Data":"33dc858c5df0f26a2c7f43867f43daf311a5ae77a3e3ca06e017d8b19e201714"} Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.776417 4812 scope.go:117] "RemoveContainer" containerID="c6883ef1a1eeee16e4ad80d31f5bfc49ef641b816b999dfd58cbdee3de121392" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.801485 4812 scope.go:117] "RemoveContainer" containerID="b2053bfd5d695201d9bffcdc4c6f27913c5f37b27304a9fefaa61bdbfc9e299f" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.814575 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.818766 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-94kb7"] Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.825561 4812 scope.go:117] "RemoveContainer" containerID="d963cad55efc340ffe3867f97e99aa62dace22acfbcc8e929a6aaf2f8f0a019b" Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.881586 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:30:24 crc kubenswrapper[4812]: I0131 04:30:24.881944 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-sktpt" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="registry-server" containerID="cri-o://4d3a725e3d8169eff4a889017e23a84eae2b00bd4f2c6e88959e05059c69d1f3" gracePeriod=2 Jan 31 04:30:25 crc kubenswrapper[4812]: I0131 04:30:25.796119 4812 generic.go:334] "Generic (PLEG): container finished" podID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" 
containerID="4d3a725e3d8169eff4a889017e23a84eae2b00bd4f2c6e88959e05059c69d1f3" exitCode=0 Jan 31 04:30:25 crc kubenswrapper[4812]: I0131 04:30:25.796194 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerDied","Data":"4d3a725e3d8169eff4a889017e23a84eae2b00bd4f2c6e88959e05059c69d1f3"} Jan 31 04:30:25 crc kubenswrapper[4812]: I0131 04:30:25.989867 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.172195 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content\") pod \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.172332 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-g5b6g\" (UniqueName: \"kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g\") pod \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.172427 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities\") pod \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\" (UID: \"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6\") " Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.173901 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities" (OuterVolumeSpecName: "utilities") pod "e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" (UID: 
"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.194180 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g" (OuterVolumeSpecName: "kube-api-access-g5b6g") pod "e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" (UID: "e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6"). InnerVolumeSpecName "kube-api-access-g5b6g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.200640 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" (UID: "e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.274160 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.274224 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-g5b6g\" (UniqueName: \"kubernetes.io/projected/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-kube-api-access-g5b6g\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.274249 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.347722 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" path="/var/lib/kubelet/pods/7cfc040d-5e3a-4ee4-a72d-c67c8d51d441/volumes" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.807788 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-sktpt" event={"ID":"e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6","Type":"ContainerDied","Data":"022a89e8489d0306fb9a757b3466fdabae91fc2b08bfba554cf816d64e3a7d87"} Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.807887 4812 scope.go:117] "RemoveContainer" containerID="4d3a725e3d8169eff4a889017e23a84eae2b00bd4f2c6e88959e05059c69d1f3" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.807886 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-sktpt" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.838367 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.840112 4812 scope.go:117] "RemoveContainer" containerID="0719fc29a76ad28fa8ea510f5749e3ecaf212ffb2f213287312eb8f45b5eb3f5" Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.844826 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-sktpt"] Jan 31 04:30:26 crc kubenswrapper[4812]: I0131 04:30:26.873198 4812 scope.go:117] "RemoveContainer" containerID="f97096db8185c82904d384f77f0b5fcee905aa11744a2342c654e99169a57ff2" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.101317 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" containerID="cri-o://2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6" gracePeriod=15 Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.270564 4812 kubelet.go:2437] "SyncLoop 
DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.271484 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-jtdpt" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="registry-server" containerID="cri-o://f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e" gracePeriod=2 Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.635418 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.718801 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794360 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794414 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794437 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle\") pod 
\"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794595 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794617 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794635 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794657 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dc5wv\" (UniqueName: \"kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794685 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 
04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794720 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794790 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794815 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794865 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794907 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.794929 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir\") pod \"5058ec63-1bc0-4113-b436-041e7e1a37f5\" (UID: \"5058ec63-1bc0-4113-b436-041e7e1a37f5\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.795133 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "audit-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.796163 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle" (OuterVolumeSpecName: "v4-0-config-system-trusted-ca-bundle") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.796201 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig" (OuterVolumeSpecName: "v4-0-config-system-cliconfig") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-cliconfig". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.796217 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies" (OuterVolumeSpecName: "audit-policies") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "audit-policies". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.796665 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca" (OuterVolumeSpecName: "v4-0-config-system-service-ca") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-service-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.800085 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert" (OuterVolumeSpecName: "v4-0-config-system-serving-cert") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.800343 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session" (OuterVolumeSpecName: "v4-0-config-system-session") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-session". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.800381 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv" (OuterVolumeSpecName: "kube-api-access-dc5wv") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "kube-api-access-dc5wv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.800581 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error" (OuterVolumeSpecName: "v4-0-config-user-template-error") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-user-template-error". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.800866 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection" (OuterVolumeSpecName: "v4-0-config-user-template-provider-selection") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-user-template-provider-selection". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.801058 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login" (OuterVolumeSpecName: "v4-0-config-user-template-login") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-user-template-login". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.801103 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs" (OuterVolumeSpecName: "v4-0-config-system-router-certs") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-router-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.801358 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data" (OuterVolumeSpecName: "v4-0-config-user-idp-0-file-data") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-user-idp-0-file-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.803316 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template" (OuterVolumeSpecName: "v4-0-config-system-ocp-branding-template") pod "5058ec63-1bc0-4113-b436-041e7e1a37f5" (UID: "5058ec63-1bc0-4113-b436-041e7e1a37f5"). InnerVolumeSpecName "v4-0-config-system-ocp-branding-template". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.814786 4812 generic.go:334] "Generic (PLEG): container finished" podID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerID="2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6" exitCode=0 Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.814923 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" event={"ID":"5058ec63-1bc0-4113-b436-041e7e1a37f5","Type":"ContainerDied","Data":"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6"} Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.814960 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" event={"ID":"5058ec63-1bc0-4113-b436-041e7e1a37f5","Type":"ContainerDied","Data":"3708f9e1581d697ee500ade98f36252570fa0dd5943014586b22ddabe9492a5b"} Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.814981 4812 scope.go:117] "RemoveContainer" containerID="2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.815037 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-authentication/oauth-openshift-558db77b4-57sr9" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.819277 4812 generic.go:334] "Generic (PLEG): container finished" podID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerID="f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e" exitCode=0 Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.819388 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-jtdpt" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.819323 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerDied","Data":"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e"} Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.819521 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-jtdpt" event={"ID":"b369d585-140d-46fb-8b27-42f6fdc8817a","Type":"ContainerDied","Data":"528cd7e8b53114ff61920038f455c2ce5829e71278b1344bea483e3f4cb9e065"} Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.849611 4812 scope.go:117] "RemoveContainer" containerID="2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6" Jan 31 04:30:27 crc kubenswrapper[4812]: E0131 04:30:27.850414 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6\": container with ID starting with 2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6 not found: ID does not exist" containerID="2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.850455 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6"} err="failed to get container status \"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6\": rpc error: code = NotFound desc = could not find container \"2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6\": container with ID starting with 2c1e914868b971847ed2e10b254403c3a5fe6589da1917a039813c7a06c0b8b6 not found: ID does not exist" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 
04:30:27.850500 4812 scope.go:117] "RemoveContainer" containerID="f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.857177 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.869150 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-authentication/oauth-openshift-558db77b4-57sr9"] Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.878206 4812 scope.go:117] "RemoveContainer" containerID="4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896170 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities\") pod \"b369d585-140d-46fb-8b27-42f6fdc8817a\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896218 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content\") pod \"b369d585-140d-46fb-8b27-42f6fdc8817a\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896302 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ctmtv\" (UniqueName: \"kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv\") pod \"b369d585-140d-46fb-8b27-42f6fdc8817a\" (UID: \"b369d585-140d-46fb-8b27-42f6fdc8817a\") " Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896518 4812 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-dir\") on node \"crc\" DevicePath 
\"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896535 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-provider-selection\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896548 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-trusted-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896557 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-ocp-branding-template\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896567 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-router-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896578 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-session\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896588 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896598 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dc5wv\" 
(UniqueName: \"kubernetes.io/projected/5058ec63-1bc0-4113-b436-041e7e1a37f5-kube-api-access-dc5wv\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896609 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-service-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896619 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-idp-0-file-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896628 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-login\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896638 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-user-template-error\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896646 4812 reconciler_common.go:293] "Volume detached for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-audit-policies\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.896656 4812 reconciler_common.go:293] "Volume detached for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/5058ec63-1bc0-4113-b436-041e7e1a37f5-v4-0-config-system-cliconfig\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.897749 4812 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities" (OuterVolumeSpecName: "utilities") pod "b369d585-140d-46fb-8b27-42f6fdc8817a" (UID: "b369d585-140d-46fb-8b27-42f6fdc8817a"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.899100 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv" (OuterVolumeSpecName: "kube-api-access-ctmtv") pod "b369d585-140d-46fb-8b27-42f6fdc8817a" (UID: "b369d585-140d-46fb-8b27-42f6fdc8817a"). InnerVolumeSpecName "kube-api-access-ctmtv". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.909099 4812 scope.go:117] "RemoveContainer" containerID="f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.932255 4812 scope.go:117] "RemoveContainer" containerID="f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e" Jan 31 04:30:27 crc kubenswrapper[4812]: E0131 04:30:27.932989 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e\": container with ID starting with f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e not found: ID does not exist" containerID="f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.933031 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e"} err="failed to get container status \"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e\": rpc error: code = NotFound desc = could not 
find container \"f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e\": container with ID starting with f378d59e33bdffc5dd082253cfe2cc32338079b0034255ddcfcfe8ba21c2ae7e not found: ID does not exist" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.933056 4812 scope.go:117] "RemoveContainer" containerID="4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d" Jan 31 04:30:27 crc kubenswrapper[4812]: E0131 04:30:27.933688 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d\": container with ID starting with 4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d not found: ID does not exist" containerID="4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.933709 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d"} err="failed to get container status \"4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d\": rpc error: code = NotFound desc = could not find container \"4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d\": container with ID starting with 4f66c8542febc066f9270f9ce5ccaf7f2a5cd52dbea78f393d6ace342f62ff8d not found: ID does not exist" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.933721 4812 scope.go:117] "RemoveContainer" containerID="f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270" Jan 31 04:30:27 crc kubenswrapper[4812]: E0131 04:30:27.934120 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270\": container with ID starting with f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270 not found: ID 
does not exist" containerID="f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.934135 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270"} err="failed to get container status \"f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270\": rpc error: code = NotFound desc = could not find container \"f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270\": container with ID starting with f981c947974314ff2b2f093d4978619d32fa3ad99f981d703eb71a9e3ba99270 not found: ID does not exist" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.998312 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ctmtv\" (UniqueName: \"kubernetes.io/projected/b369d585-140d-46fb-8b27-42f6fdc8817a-kube-api-access-ctmtv\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:27 crc kubenswrapper[4812]: I0131 04:30:27.998356 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.038862 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b369d585-140d-46fb-8b27-42f6fdc8817a" (UID: "b369d585-140d-46fb-8b27-42f6fdc8817a"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.100121 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b369d585-140d-46fb-8b27-42f6fdc8817a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.164070 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.169623 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-jtdpt"] Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.354527 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" path="/var/lib/kubelet/pods/5058ec63-1bc0-4113-b436-041e7e1a37f5/volumes" Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.355972 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" path="/var/lib/kubelet/pods/b369d585-140d-46fb-8b27-42f6fdc8817a/volumes" Jan 31 04:30:28 crc kubenswrapper[4812]: I0131 04:30:28.357370 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" path="/var/lib/kubelet/pods/e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6/volumes" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192377 4812 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192689 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="extract-utilities" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192710 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="extract-utilities" 
Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192732 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192745 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192766 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="extract-utilities" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192779 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="extract-utilities" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192797 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192809 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192826 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="extract-utilities" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192889 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="extract-utilities" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192921 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192942 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="registry-server" Jan 31 
04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192960 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.192977 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.192997 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193014 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="extract-content" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.193039 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193054 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.193077 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193089 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193263 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5058ec63-1bc0-4113-b436-041e7e1a37f5" containerName="oauth-openshift" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193294 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b369d585-140d-46fb-8b27-42f6fdc8817a" containerName="registry-server" Jan 31 
04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193308 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="7cfc040d-5e3a-4ee4-a72d-c67c8d51d441" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193325 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e4f95a0f-5ac3-4b79-9497-5a8f5d65b9e6" containerName="registry-server" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193759 4812 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.193984 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.194191 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" containerID="cri-o://4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2" gracePeriod=15 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.194274 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" containerID="cri-o://ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2" gracePeriod=15 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.194293 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" containerID="cri-o://f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9" gracePeriod=15 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.194452 4812 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" containerID="cri-o://92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7" gracePeriod=15 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.194523 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" containerID="cri-o://ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c" gracePeriod=15 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195452 4812 kubelet.go:2421] "SyncLoop ADD" source="file" pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195725 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195747 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195769 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195783 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195809 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="setup" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195873 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" 
containerName="setup" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195896 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195909 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195929 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195941 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.195961 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.195973 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.196138 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-syncer" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.196159 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.196176 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-check-endpoints" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.196198 4812 
memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-cert-regeneration-controller" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.196217 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f4b27818a5e8e43d0dc095d08835c792" containerName="kube-apiserver-insecure-readyz" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.247081 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343346 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343409 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343447 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343482 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-resource-dir\" (UniqueName: 
\"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343671 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343739 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.343899 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.344016 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.444803 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.444894 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.444948 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.444957 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-resource-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.444984 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445087 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") 
pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445047 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445046 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-cert-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445350 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445391 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445415 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445447 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445474 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445492 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445448 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/71bb4a3aecc4ba5b26c4b7318770ce13-audit-dir\") pod \"kube-apiserver-crc\" (UID: \"71bb4a3aecc4ba5b26c4b7318770ce13\") " pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.445577 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"kube-apiserver-startup-monitor-crc\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " 
pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.543127 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:30:31 crc kubenswrapper[4812]: W0131 04:30:31.572180 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf85e55b1a89d02b0cb034b1ea31ed45a.slice/crio-1874c97a209f017ccf9128a0ffa5b4d4eddae844396f7b6d1658a2abbc086be8 WatchSource:0}: Error finding container 1874c97a209f017ccf9128a0ffa5b4d4eddae844396f7b6d1658a2abbc086be8: Status 404 returned error can't find the container with id 1874c97a209f017ccf9128a0ffa5b4d4eddae844396f7b6d1658a2abbc086be8 Jan 31 04:30:31 crc kubenswrapper[4812]: E0131 04:30:31.576384 4812 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fb67e2a84e683 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 04:30:31.575381635 +0000 UTC m=+240.070403340,LastTimestamp:2026-01-31 04:30:31.575381635 +0000 UTC m=+240.070403340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.879335 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.880406 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2" exitCode=0 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.880435 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9" exitCode=0 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.880446 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7" exitCode=0 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.880455 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c" exitCode=2 Jan 31 04:30:31 crc kubenswrapper[4812]: I0131 04:30:31.882070 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"1874c97a209f017ccf9128a0ffa5b4d4eddae844396f7b6d1658a2abbc086be8"} Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.322130 4812 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc 
kubenswrapper[4812]: E0131 04:30:32.322813 4812 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.323333 4812 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.323719 4812 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.324200 4812 controller.go:195] "Failed to update lease" err="Put \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: I0131 04:30:32.324252 4812 controller.go:115] "failed to update lease using latest lease, fallback to ensure lease" err="failed 5 attempts to update lease" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.324607 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="200ms" Jan 31 04:30:32 crc kubenswrapper[4812]: I0131 04:30:32.346820 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get 
\"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: I0131 04:30:32.348119 4812 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.525191 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="400ms" Jan 31 04:30:32 crc kubenswrapper[4812]: I0131 04:30:32.891986 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" event={"ID":"f85e55b1a89d02b0cb034b1ea31ed45a","Type":"ContainerStarted","Data":"d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94"} Jan 31 04:30:32 crc kubenswrapper[4812]: I0131 04:30:32.892911 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:32 crc kubenswrapper[4812]: E0131 04:30:32.926678 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="800ms" Jan 31 04:30:33 crc 
kubenswrapper[4812]: E0131 04:30:33.392065 4812 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC openshift-image-registry/crc-image-registry-storage: failed to fetch PVC from API server: Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-image-registry/persistentvolumeclaims/crc-image-registry-storage\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" volumeName="registry-storage" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.698295 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.699944 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.700641 4812 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.701262 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:33 crc kubenswrapper[4812]: E0131 04:30:33.728241 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get 
\"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="1.6s" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.877758 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878011 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir" (OuterVolumeSpecName: "cert-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "cert-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878119 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878197 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") pod \"f4b27818a5e8e43d0dc095d08835c792\" (UID: \"f4b27818a5e8e43d0dc095d08835c792\") " Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878209 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir" (OuterVolumeSpecName: "audit-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "audit-dir". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878348 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f4b27818a5e8e43d0dc095d08835c792" (UID: "f4b27818a5e8e43d0dc095d08835c792"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878628 4812 reconciler_common.go:293] "Volume detached for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-audit-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878673 4812 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.878694 4812 reconciler_common.go:293] "Volume detached for volume \"cert-dir\" (UniqueName: \"kubernetes.io/host-path/f4b27818a5e8e43d0dc095d08835c792-cert-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.903453 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-crc_f4b27818a5e8e43d0dc095d08835c792/kube-apiserver-cert-syncer/0.log" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.905259 4812 generic.go:334] "Generic (PLEG): container finished" podID="f4b27818a5e8e43d0dc095d08835c792" containerID="4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2" exitCode=0 Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.905424 4812 scope.go:117] "RemoveContainer" containerID="ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.905464 4812 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.930144 4812 scope.go:117] "RemoveContainer" containerID="f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.937710 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.938423 4812 status_manager.go:851] "Failed to get status for pod" podUID="f4b27818a5e8e43d0dc095d08835c792" pod="openshift-kube-apiserver/kube-apiserver-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.954666 4812 scope.go:117] "RemoveContainer" containerID="92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.977404 4812 scope.go:117] "RemoveContainer" containerID="ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c" Jan 31 04:30:33 crc kubenswrapper[4812]: I0131 04:30:33.997246 4812 scope.go:117] "RemoveContainer" containerID="4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.021293 4812 scope.go:117] "RemoveContainer" containerID="412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.047337 4812 scope.go:117] "RemoveContainer" containerID="ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2" Jan 31 
04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.047822 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\": container with ID starting with ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2 not found: ID does not exist" containerID="ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.047900 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2"} err="failed to get container status \"ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\": rpc error: code = NotFound desc = could not find container \"ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2\": container with ID starting with ae25658c81a441dabc65e817fa5cc0cf20ad4e42f30000a9a23679c2921a57e2 not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.047939 4812 scope.go:117] "RemoveContainer" containerID="f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9" Jan 31 04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.048530 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\": container with ID starting with f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9 not found: ID does not exist" containerID="f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.048664 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9"} err="failed to get container status 
\"f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\": rpc error: code = NotFound desc = could not find container \"f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9\": container with ID starting with f8cbb9b593a693c66b3bcbb8ff7390845422764bf25edcdb9aafdc961a50e0e9 not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.048758 4812 scope.go:117] "RemoveContainer" containerID="92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7" Jan 31 04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.049233 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\": container with ID starting with 92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7 not found: ID does not exist" containerID="92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.049277 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7"} err="failed to get container status \"92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\": rpc error: code = NotFound desc = could not find container \"92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7\": container with ID starting with 92d883ce9d78e5496f856ba0b776ba94fa3d28e125402959e0de7ea9f17341a7 not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.049308 4812 scope.go:117] "RemoveContainer" containerID="ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c" Jan 31 04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.049744 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\": container with ID starting with ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c not found: ID does not exist" containerID="ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.049785 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c"} err="failed to get container status \"ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\": rpc error: code = NotFound desc = could not find container \"ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c\": container with ID starting with ccaee3afff893451ea3e55957fa2793a107216ccb8839e9db5aa78185f9ff52c not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.049809 4812 scope.go:117] "RemoveContainer" containerID="4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2" Jan 31 04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.050247 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\": container with ID starting with 4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2 not found: ID does not exist" containerID="4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.050292 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2"} err="failed to get container status \"4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\": rpc error: code = NotFound desc = could not find container \"4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2\": container with ID 
starting with 4d037d814c3e960d9fbc9c7a78898a663022dc18109db2fef5b97e9df79c26c2 not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.050329 4812 scope.go:117] "RemoveContainer" containerID="412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3" Jan 31 04:30:34 crc kubenswrapper[4812]: E0131 04:30:34.050771 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\": container with ID starting with 412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3 not found: ID does not exist" containerID="412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.050810 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3"} err="failed to get container status \"412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\": rpc error: code = NotFound desc = could not find container \"412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3\": container with ID starting with 412dbd1d2404b9335e1158ab0e59cbc03c080d88170fd1b3bc1c35b61386aef3 not found: ID does not exist" Jan 31 04:30:34 crc kubenswrapper[4812]: I0131 04:30:34.351124 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f4b27818a5e8e43d0dc095d08835c792" path="/var/lib/kubelet/pods/f4b27818a5e8e43d0dc095d08835c792/volumes" Jan 31 04:30:35 crc kubenswrapper[4812]: E0131 04:30:35.329955 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="3.2s" Jan 31 04:30:36 crc kubenswrapper[4812]: I0131 04:30:36.925326 4812 
generic.go:334] "Generic (PLEG): container finished" podID="550755c5-af9b-446d-a3ad-d5afc6264e89" containerID="693bed728ba21f37217257e0a7b27a629eb1be4e6d2d5395a94bba1c8cb0122b" exitCode=0 Jan 31 04:30:36 crc kubenswrapper[4812]: I0131 04:30:36.925419 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"550755c5-af9b-446d-a3ad-d5afc6264e89","Type":"ContainerDied","Data":"693bed728ba21f37217257e0a7b27a629eb1be4e6d2d5395a94bba1c8cb0122b"} Jan 31 04:30:36 crc kubenswrapper[4812]: I0131 04:30:36.926330 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:36 crc kubenswrapper[4812]: I0131 04:30:36.926741 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.361455 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.362488 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.363140 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:38 crc kubenswrapper[4812]: E0131 04:30:38.532670 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="6.4s" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.541340 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access\") pod \"550755c5-af9b-446d-a3ad-d5afc6264e89\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.541589 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir\") pod \"550755c5-af9b-446d-a3ad-d5afc6264e89\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.541743 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock\") pod \"550755c5-af9b-446d-a3ad-d5afc6264e89\" (UID: \"550755c5-af9b-446d-a3ad-d5afc6264e89\") " Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.541664 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir" (OuterVolumeSpecName: "kubelet-dir") pod "550755c5-af9b-446d-a3ad-d5afc6264e89" (UID: "550755c5-af9b-446d-a3ad-d5afc6264e89"). InnerVolumeSpecName "kubelet-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.542366 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock" (OuterVolumeSpecName: "var-lock") pod "550755c5-af9b-446d-a3ad-d5afc6264e89" (UID: "550755c5-af9b-446d-a3ad-d5afc6264e89"). InnerVolumeSpecName "var-lock". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.542572 4812 reconciler_common.go:293] "Volume detached for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-kubelet-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.549041 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access" (OuterVolumeSpecName: "kube-api-access") pod "550755c5-af9b-446d-a3ad-d5afc6264e89" (UID: "550755c5-af9b-446d-a3ad-d5afc6264e89"). InnerVolumeSpecName "kube-api-access". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.644295 4812 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/550755c5-af9b-446d-a3ad-d5afc6264e89-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.644405 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access\" (UniqueName: \"kubernetes.io/projected/550755c5-af9b-446d-a3ad-d5afc6264e89-kube-api-access\") on node \"crc\" DevicePath \"\"" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.945284 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/installer-9-crc" event={"ID":"550755c5-af9b-446d-a3ad-d5afc6264e89","Type":"ContainerDied","Data":"b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81"} Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.945344 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b9b0daa9efe5b7b2b4104f6a9373ad2c27ca8fae639cbad9a4a033af0b560b81" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.945419 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/installer-9-crc" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.968398 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:38 crc kubenswrapper[4812]: I0131 04:30:38.969080 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:39 crc kubenswrapper[4812]: E0131 04:30:39.820446 4812 event.go:368] "Unable to write event (may retry after sleeping)" err="Post \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/events\": dial tcp 38.102.83.238:6443: connect: connection refused" event="&Event{ObjectMeta:{kube-apiserver-startup-monitor-crc.188fb67e2a84e683 openshift-kube-apiserver 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Pod,Namespace:openshift-kube-apiserver,Name:kube-apiserver-startup-monitor-crc,UID:f85e55b1a89d02b0cb034b1ea31ed45a,APIVersion:v1,ResourceVersion:,FieldPath:spec.containers{startup-monitor},},Reason:Pulled,Message:Container image \"quay.io/crcont/openshift-crc-cluster-kube-apiserver-operator@sha256:9f36dc276e27753fc478274c7f7814a4f8945c987117ee1ea3b8e6355e6d7462\" already present on machine,Source:EventSource{Component:kubelet,Host:crc,},FirstTimestamp:2026-01-31 04:30:31.575381635 +0000 UTC m=+240.070403340,LastTimestamp:2026-01-31 04:30:31.575381635 +0000 UTC m=+240.070403340,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 
UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:crc,}" Jan 31 04:30:42 crc kubenswrapper[4812]: I0131 04:30:42.343566 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:42 crc kubenswrapper[4812]: I0131 04:30:42.344899 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.983420 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.983504 4812 generic.go:334] "Generic (PLEG): container finished" podID="f614b9022728cf315e60c057852e563e" containerID="71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792" exitCode=1 Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.983550 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerDied","Data":"71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792"} Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.984232 4812 scope.go:117] "RemoveContainer" containerID="71c2e5c81e1195477c351bd8e2bf3a2f1d2715ac4623444af7b2958d24db8792" Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.984537 4812 status_manager.go:851] 
"Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.984920 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:43 crc kubenswrapper[4812]: I0131 04:30:43.985429 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.733876 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:30:44 crc kubenswrapper[4812]: E0131 04:30:44.933469 4812 controller.go:145] "Failed to ensure lease exists, will retry" err="Get \"https://api-int.crc.testing:6443/apis/coordination.k8s.io/v1/namespaces/kube-node-lease/leases/crc?timeout=10s\": dial tcp 38.102.83.238:6443: connect: connection refused" interval="7s" Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.994972 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-controller-manager_kube-controller-manager-crc_f614b9022728cf315e60c057852e563e/kube-controller-manager/0.log" Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.995068 4812 kubelet.go:2453] 
"SyncLoop (PLEG): event for pod" pod="openshift-kube-controller-manager/kube-controller-manager-crc" event={"ID":"f614b9022728cf315e60c057852e563e","Type":"ContainerStarted","Data":"bccc1f34446c5da6eec0782aeb2d7d827d9e661f3651661ed5c7e501d950fca2"} Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.996036 4812 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.996508 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:44 crc kubenswrapper[4812]: I0131 04:30:44.997091 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.338717 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.339617 4812 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.340270 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.340966 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.362713 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.362751 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:45 crc kubenswrapper[4812]: E0131 04:30:45.363334 4812 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection 
refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:45 crc kubenswrapper[4812]: I0131 04:30:45.364014 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:45 crc kubenswrapper[4812]: W0131 04:30:45.395178 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod71bb4a3aecc4ba5b26c4b7318770ce13.slice/crio-e90c675c055d276d06fa68aa171bfd8e50d538a03872200c57ab5805fbf9df8f WatchSource:0}: Error finding container e90c675c055d276d06fa68aa171bfd8e50d538a03872200c57ab5805fbf9df8f: Status 404 returned error can't find the container with id e90c675c055d276d06fa68aa171bfd8e50d538a03872200c57ab5805fbf9df8f Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.008168 4812 generic.go:334] "Generic (PLEG): container finished" podID="71bb4a3aecc4ba5b26c4b7318770ce13" containerID="28e20f4edbf0733bbe8dbb2af50914252d59669ca8ffa6cf69293ca18c19fa6d" exitCode=0 Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.008287 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerDied","Data":"28e20f4edbf0733bbe8dbb2af50914252d59669ca8ffa6cf69293ca18c19fa6d"} Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.008618 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"e90c675c055d276d06fa68aa171bfd8e50d538a03872200c57ab5805fbf9df8f"} Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.009401 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.009445 4812 mirror_client.go:130] "Deleting a mirror pod" 
pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.010029 4812 status_manager.go:851] "Failed to get status for pod" podUID="f614b9022728cf315e60c057852e563e" pod="openshift-kube-controller-manager/kube-controller-manager-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-controller-manager/pods/kube-controller-manager-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:46 crc kubenswrapper[4812]: E0131 04:30:46.010105 4812 mirror_client.go:138] "Failed deleting a mirror pod" err="Delete \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.010602 4812 status_manager.go:851] "Failed to get status for pod" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" pod="openshift-kube-apiserver/installer-9-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/installer-9-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:46 crc kubenswrapper[4812]: I0131 04:30:46.011081 4812 status_manager.go:851] "Failed to get status for pod" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" err="Get \"https://api-int.crc.testing:6443/api/v1/namespaces/openshift-kube-apiserver/pods/kube-apiserver-startup-monitor-crc\": dial tcp 38.102.83.238:6443: connect: connection refused" Jan 31 04:30:47 crc kubenswrapper[4812]: I0131 04:30:47.017888 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"34af39aaa7adc64f9ecf5ef7b451b98832a1250a8d554391d1fd9fec2a3302fa"} Jan 31 04:30:47 
crc kubenswrapper[4812]: I0131 04:30:47.018130 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"4740c32a845281d294612cc2391c2306615ab8a39fb7e2d1be58b8c93a374afc"} Jan 31 04:30:47 crc kubenswrapper[4812]: I0131 04:30:47.018140 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"79aa006098168b2f1768dc3ef640c754369e60da1f49b792a57fea25a77d4a58"} Jan 31 04:30:48 crc kubenswrapper[4812]: I0131 04:30:48.025483 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"b17ea85697a676ab9e966d9302bb6b4b60a9ccd9efe0a9fbc09a404ca69a2c8d"} Jan 31 04:30:48 crc kubenswrapper[4812]: I0131 04:30:48.025530 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-kube-apiserver/kube-apiserver-crc" event={"ID":"71bb4a3aecc4ba5b26c4b7318770ce13","Type":"ContainerStarted","Data":"9f2964a32c0fe07e653fca61fd2cb3522468773a608de70e503e824a8b467eee"} Jan 31 04:30:48 crc kubenswrapper[4812]: I0131 04:30:48.025781 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:48 crc kubenswrapper[4812]: I0131 04:30:48.025797 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:48 crc kubenswrapper[4812]: I0131 04:30:48.026023 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:49 crc kubenswrapper[4812]: I0131 04:30:49.166948 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:30:49 crc kubenswrapper[4812]: I0131 04:30:49.174690 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:30:50 crc kubenswrapper[4812]: I0131 04:30:50.036422 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:30:50 crc kubenswrapper[4812]: I0131 04:30:50.364988 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:50 crc kubenswrapper[4812]: I0131 04:30:50.365968 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:50 crc kubenswrapper[4812]: I0131 04:30:50.370864 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:53 crc kubenswrapper[4812]: I0131 04:30:53.037071 4812 kubelet.go:1914] "Deleted mirror pod because it is outdated" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:53 crc kubenswrapper[4812]: I0131 04:30:53.064169 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:53 crc kubenswrapper[4812]: I0131 04:30:53.064203 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:53 crc kubenswrapper[4812]: I0131 04:30:53.068414 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:30:53 crc kubenswrapper[4812]: I0131 04:30:53.130328 4812 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" 
pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e81ae5c6-3f9c-440f-a5b5-1dde2a967746" Jan 31 04:30:54 crc kubenswrapper[4812]: I0131 04:30:54.070328 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:54 crc kubenswrapper[4812]: I0131 04:30:54.070952 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:30:54 crc kubenswrapper[4812]: I0131 04:30:54.082320 4812 status_manager.go:861] "Pod was deleted and then recreated, skipping status update" pod="openshift-kube-apiserver/kube-apiserver-crc" oldPodUID="71bb4a3aecc4ba5b26c4b7318770ce13" podUID="e81ae5c6-3f9c-440f-a5b5-1dde2a967746" Jan 31 04:30:54 crc kubenswrapper[4812]: I0131 04:30:54.739467 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-controller-manager/kube-controller-manager-crc" Jan 31 04:31:01 crc kubenswrapper[4812]: I0131 04:31:01.900484 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-root-ca.crt" Jan 31 04:31:02 crc kubenswrapper[4812]: I0131 04:31:02.427813 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"kube-storage-version-migrator-operator-dockercfg-2bh8d" Jan 31 04:31:02 crc kubenswrapper[4812]: I0131 04:31:02.460641 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"kube-root-ca.crt" Jan 31 04:31:03 crc kubenswrapper[4812]: I0131 04:31:03.266079 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"openshift-service-ca.crt" Jan 31 04:31:03 crc kubenswrapper[4812]: I0131 04:31:03.683808 4812 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-node-identity"/"kube-root-ca.crt" Jan 31 04:31:03 crc kubenswrapper[4812]: I0131 04:31:03.742991 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"node-bootstrapper-token" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.007998 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-node-metrics-cert" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.095129 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"trusted-ca" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.099682 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-console"/"networking-console-plugin-cert" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.214977 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.263106 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"trusted-ca" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.383563 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-dockercfg-jwfmh" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.390483 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-root-ca.crt" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.394655 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"trusted-ca" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.520705 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-serving-cert" Jan 31 04:31:04 crc kubenswrapper[4812]: I0131 04:31:04.876717 4812 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-secret" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.346799 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-serving-cert" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.549052 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"encryption-config-1" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.587292 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.660210 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"openshift-service-ca.crt" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.752571 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.866373 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-dockercfg-r9srn" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.895725 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"proxy-tls" Jan 31 04:31:05 crc kubenswrapper[4812]: I0131 04:31:05.899040 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"kube-root-ca.crt" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.004108 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"etcd-client" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.053083 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 04:31:06 crc 
kubenswrapper[4812]: I0131 04:31:06.099694 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"trusted-ca-bundle" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.120625 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"serving-cert" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.272198 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"openshift-service-ca.crt" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.297434 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"kube-scheduler-operator-serving-cert" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.377828 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-version"/"default-dockercfg-gxtc4" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.550946 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"ovnkube-identity-cm" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.582659 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"openshift-service-ca.crt" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.599744 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-metrics-certs-default" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.619868 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"kube-root-ca.crt" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.675413 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-tls" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.797083 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-image-registry"/"node-ca-dockercfg-4777p" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.799637 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"openshift-apiserver-sa-dockercfg-djjff" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.845113 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-control-plane-metrics-cert" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.849172 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-dockercfg-vw8fw" Jan 31 04:31:06 crc kubenswrapper[4812]: I0131 04:31:06.962689 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-tls" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.004581 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.035060 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-dockercfg-k9rxt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.137617 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.145544 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-operator"/"metrics-tls" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.236124 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"service-ca-dockercfg-pn86c" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.257525 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 
04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.297484 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-sa-dockercfg-nl2j4" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.327517 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"metrics-tls" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.334006 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"machine-approver-config" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.344925 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"serving-cert" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.373035 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"openshift-service-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.432374 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.493578 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-dockercfg-qt55r" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.503579 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.599967 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.602388 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-rbac-proxy" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.649820 4812 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-machine-api"/"kube-rbac-proxy" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.663444 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"default-dockercfg-chnjx" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.697204 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"service-ca-bundle" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.716731 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"etcd-serving-ca" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.738818 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"openshift-service-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.798507 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"kube-root-ca.crt" Jan 31 04:31:07 crc kubenswrapper[4812]: I0131 04:31:07.890245 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"openshift-service-ca.crt" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.083547 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"kube-root-ca.crt" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.218540 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"samples-operator-tls" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.400362 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"trusted-ca-bundle" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.552259 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"openshift-service-ca.crt" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.615252 4812 reflector.go:368] Caches 
populated for *v1.ConfigMap from object-"openshift-apiserver"/"audit-1" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.617124 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"encryption-config-1" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.641686 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-config" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.646488 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-scheduler-operator"/"openshift-kube-scheduler-operator-config" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.752624 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.760338 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"package-server-manager-serving-cert" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.806781 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"openshift-service-ca.crt" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.890245 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-stats-default" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.933161 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"openshift-service-ca.crt" Jan 31 04:31:08 crc kubenswrapper[4812]: I0131 04:31:08.949692 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"kube-root-ca.crt" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.087701 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"audit-1" Jan 31 04:31:09 
crc kubenswrapper[4812]: I0131 04:31:09.138716 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"kube-root-ca.crt" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.204936 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator-operator"/"serving-cert" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.241370 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-operator-serving-cert" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.268330 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"kube-root-ca.crt" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.341630 4812 reflector.go:368] Caches populated for *v1.Service from k8s.io/client-go/informers/factory.go:160 Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.407952 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"metrics-daemon-sa-dockercfg-d427c" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.445631 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"oauth-serving-cert" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.534248 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-daemon-dockercfg-r5tcq" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.576091 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-console"/"networking-console-plugin" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.752658 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"config" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.773952 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-cluster-version"/"cluster-version-operator-serving-cert" Jan 31 04:31:09 crc kubenswrapper[4812]: I0131 04:31:09.857300 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-admission-controller-secret" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.003737 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"default-dockercfg-2llfx" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.005959 4812 reflector.go:368] Caches populated for *v1.Node from k8s.io/client-go/informers/factory.go:160 Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.039674 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"openshift-service-ca.crt" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.139264 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"default-cni-sysctl-allowlist" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.330293 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-network-node-identity"/"network-node-identity-cert" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.376020 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator-operator"/"openshift-service-ca.crt" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.488740 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"machine-config-operator-images" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.490147 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"dns-default-metrics-tls" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.500633 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"trusted-ca-bundle" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.505353 4812 
reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication-operator"/"authentication-operator-dockercfg-mz9bj" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.528471 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"packageserver-service-cert" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.570482 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.588803 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"serving-cert" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.590131 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress"/"kube-root-ca.crt" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.868865 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-config-operator"/"kube-root-ca.crt" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.905633 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"catalog-operator-serving-cert" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.906189 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca"/"signing-key" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.946145 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"kube-root-ca.crt" Jan 31 04:31:10 crc kubenswrapper[4812]: I0131 04:31:10.980290 4812 reflector.go:368] Caches populated for *v1.CSIDriver from k8s.io/client-go/informers/factory.go:160 Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.022181 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-certs-default" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 
04:31:11.067985 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-operator-dockercfg-98p87" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.147753 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mcc-proxy-tls" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.266048 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"authentication-operator-config" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.317004 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"kube-rbac-proxy" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.353318 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-samples-operator"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.387441 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"kube-root-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.387642 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"kube-root-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.390782 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-config-operator"/"config-operator-serving-cert" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.429631 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns"/"dns-default" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.430683 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.525668 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-operator-lifecycle-manager"/"pprof-cert" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.572147 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-node-dockercfg-pwtwl" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.582393 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.601319 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-canary"/"canary-serving-cert" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.710932 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.741398 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"service-ca-bundle" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.810751 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"mco-proxy-tls" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.901620 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.901659 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"etcd-serving-ca" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.905537 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.923585 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-operator-config" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.989887 4812 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:31:11 crc kubenswrapper[4812]: I0131 04:31:11.990264 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-config" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.111203 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"console-operator-dockercfg-4xjcr" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.177129 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"openshift-service-ca.crt" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.248920 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"control-plane-machine-set-operator-tls" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.311678 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-image-registry"/"kube-root-ca.crt" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.409401 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"metrics-tls" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.478633 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-apiserver-operator"/"kube-root-ca.crt" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.486333 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-config" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.578460 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-machine-approver"/"openshift-service-ca.crt" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.594302 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-machine-config-operator"/"machine-config-controller-dockercfg-c2lfx" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.714225 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-api"/"machine-api-operator-dockercfg-mfbb7" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.727773 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-operator"/"iptables-alerter-script" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.733431 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"console-config" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.741999 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"registry-dockercfg-kzzsd" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.850251 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"signing-cabundle" Jan 31 04:31:12 crc kubenswrapper[4812]: I0131 04:31:12.992577 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.035624 4812 reflector.go:368] Caches populated for *v1.RuntimeClass from k8s.io/client-go/informers/factory.go:160 Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.059598 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-config-operator"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.121465 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.214260 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 
04:31:13.254611 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"default-dockercfg-2q5b6" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.256024 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-operator-tls" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.317965 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-service-ca-bundle" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.396141 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-operator"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.472050 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.474829 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"openshift-service-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.477111 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"config" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.715257 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-dns-operator"/"kube-root-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.801112 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"kube-root-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.802228 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns-operator"/"dns-operator-dockercfg-9mqw5" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.810315 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"env-overrides" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 
04:31:13.834474 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-dockercfg-f62pw" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.872623 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-etcd-operator"/"etcd-ca-bundle" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.883990 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-diagnostics"/"kube-root-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.962788 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager-operator"/"kube-root-ca.crt" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.995269 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress"/"router-dockercfg-zdk86" Jan 31 04:31:13 crc kubenswrapper[4812]: I0131 04:31:13.995545 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-samples-operator"/"cluster-samples-operator-dockercfg-xpp9w" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.022395 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serviceaccount-dockercfg-rq7zk" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.056479 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.100957 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-config" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.140604 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-network-node-identity"/"env-overrides" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.216912 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-config-operator"/"openshift-config-operator-dockercfg-7pc5z" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.241693 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ancillary-tools-dockercfg-vnmsz" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.373732 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-oauth-apiserver"/"serving-cert" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.377943 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"openshift-service-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.395359 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.395572 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.413653 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"image-import-ca" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.420208 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-metrics" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.445962 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.570975 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"hostpath-provisioner"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.673953 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.694369 4812 reflector.go:368] Caches populated for 
*v1.ConfigMap from object-"openshift-image-registry"/"image-registry-certificates" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.696548 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.719721 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-kube-storage-version-migrator"/"openshift-service-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.736204 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-service-ca-operator"/"service-ca-operator-dockercfg-rg9jl" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.755964 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ovn-kubernetes"/"ovnkube-script-lib" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.756896 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"installation-pull-secrets" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.812939 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver"/"etcd-client" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.826016 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-cluster-version"/"openshift-service-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.835621 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-machine-api"/"machine-api-operator-images" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.854515 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"kube-root-ca.crt" Jan 31 04:31:14 crc kubenswrapper[4812]: I0131 04:31:14.884013 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"image-registry-tls" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.209997 
4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-ingress-canary"/"kube-root-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.211097 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-image-registry"/"cluster-image-registry-operator-dockercfg-m4qtx" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.269512 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager-operator"/"openshift-controller-manager-operator-serving-cert" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.431616 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-oauth-apiserver"/"openshift-service-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.438593 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"trusted-ca-bundle" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.528551 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca"/"openshift-service-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.615936 4812 reflector.go:368] Caches populated for *v1.Pod from pkg/kubelet/config/apiserver.go:66 Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.618834 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podStartSLOduration=44.618804727 podStartE2EDuration="44.618804727s" podCreationTimestamp="2026-01-31 04:30:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:30:53.090271811 +0000 UTC m=+261.585293476" watchObservedRunningTime="2026-01-31 04:31:15.618804727 +0000 UTC m=+284.113826422" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.626732 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-kube-apiserver/kube-apiserver-crc"] Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.627016 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-kube-apiserver/kube-apiserver-crc","openshift-authentication/oauth-openshift-75ddf57cb8-snhfn"] Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.627417 4812 kubelet.go:1909] "Trying to delete pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.627454 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-crc" podUID="6cb4c886-f070-4393-9d9f-9bf9878fcac2" Jan 31 04:31:15 crc kubenswrapper[4812]: E0131 04:31:15.627694 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" containerName="installer" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.627744 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" containerName="installer" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.628006 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="550755c5-af9b-446d-a3ad-d5afc6264e89" containerName="installer" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.628695 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.643454 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-provider-selection" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644254 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-router-certs" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644345 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"openshift-service-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644454 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-idp-0-file-data" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644556 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-error" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644595 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"audit" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644650 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-serving-cert" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644716 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-cliconfig" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.644782 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"oauth-openshift-dockercfg-znhcc" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.645351 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-session" Jan 31 04:31:15 
crc kubenswrapper[4812]: I0131 04:31:15.649771 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-kube-apiserver/kube-apiserver-crc" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.652602 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-service-ca" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.654076 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"kube-root-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.659211 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-user-template-login" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.659983 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-dockercfg-xtcjv" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.661493 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication"/"v4-0-config-system-trusted-ca-bundle" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.672587 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-authentication"/"v4-0-config-system-ocp-branding-template" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.684043 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-apiserver/kube-apiserver-crc" podStartSLOduration=22.684017267 podStartE2EDuration="22.684017267s" podCreationTimestamp="2026-01-31 04:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:31:15.676694438 +0000 UTC m=+284.171716163" watchObservedRunningTime="2026-01-31 04:31:15.684017267 +0000 UTC m=+284.179038992" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.694662 4812 
reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console"/"service-ca" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704671 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704720 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-error\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704769 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704798 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 
04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704826 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nd7hh\" (UniqueName: \"kubernetes.io/projected/b58267f5-6c56-4cc8-89a3-c89233cb9f59-kube-api-access-nd7hh\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704879 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-session\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704901 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-policies\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704930 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704956 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-service-ca\" 
(UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704978 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.704996 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-login\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.705023 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.705047 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-75ddf57cb8-snhfn\" 
(UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.705081 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-dir\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.708771 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console-operator"/"serving-cert" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.805673 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.805979 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-service-ca\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806074 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " 
pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806166 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-login\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806302 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806473 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806571 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-dir\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806649 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: 
\"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806730 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-error\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-error\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806649 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-dir\" (UniqueName: \"kubernetes.io/host-path/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-dir\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.806986 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.807107 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " 
pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.807209 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-session\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.807308 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nd7hh\" (UniqueName: \"kubernetes.io/projected/b58267f5-6c56-4cc8-89a3-c89233cb9f59-kube-api-access-nd7hh\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.807401 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-policies\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.808464 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-cliconfig\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-cliconfig\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.808744 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-service-ca\" (UniqueName: 
\"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-service-ca\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.809258 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-trusted-ca-bundle\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.809399 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"audit-policies\" (UniqueName: \"kubernetes.io/configmap/b58267f5-6c56-4cc8-89a3-c89233cb9f59-audit-policies\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.812901 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-router-certs\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-router-certs\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.812905 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-idp-0-file-data\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-idp-0-file-data\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc 
kubenswrapper[4812]: I0131 04:31:15.813595 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-session\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-session\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.813824 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-serving-cert\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-serving-cert\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.814599 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-provider-selection\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-provider-selection\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.816166 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-login\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-login\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.819590 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-user-template-error\" (UniqueName: 
\"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-user-template-error\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.821396 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"v4-0-config-system-ocp-branding-template\" (UniqueName: \"kubernetes.io/secret/b58267f5-6c56-4cc8-89a3-c89233cb9f59-v4-0-config-system-ocp-branding-template\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.841639 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nd7hh\" (UniqueName: \"kubernetes.io/projected/b58267f5-6c56-4cc8-89a3-c89233cb9f59-kube-api-access-nd7hh\") pod \"oauth-openshift-75ddf57cb8-snhfn\" (UID: \"b58267f5-6c56-4cc8-89a3-c89233cb9f59\") " pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.930194 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"kube-root-ca.crt" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.962457 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-dockercfg-x57mr" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.964714 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:15 crc kubenswrapper[4812]: I0131 04:31:15.970109 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-service-ca-operator"/"service-ca-operator-config" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.088161 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-controller-manager-operator"/"kube-controller-manager-operator-dockercfg-gkqpw" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.092673 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-cluster-machine-approver"/"machine-approver-tls" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.228931 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-authentication-operator"/"kube-root-ca.crt" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.376666 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"olm-operator-serving-cert" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.428291 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.469892 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-authentication/oauth-openshift-75ddf57cb8-snhfn"] Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.580437 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-storage-version-migrator"/"kube-storage-version-migrator-sa-dockercfg-5xfcg" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.589605 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-oauth-config" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.672205 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-oauth-apiserver"/"oauth-apiserver-sa-dockercfg-6r2bq" Jan 31 04:31:16 crc kubenswrapper[4812]: I0131 04:31:16.743752 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-etcd-operator"/"etcd-client" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.139572 4812 reflector.go:368] Caches populated for *v1.Secret from object-"hostpath-provisioner"/"csi-hostpath-provisioner-sa-dockercfg-qd74k" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.195148 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-multus"/"multus-daemon-config" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.219166 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" event={"ID":"b58267f5-6c56-4cc8-89a3-c89233cb9f59","Type":"ContainerStarted","Data":"a382751d78f65ee35dd30f67265d844e705975a057d0c664f617f5a72cfb1b79"} Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.219253 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" event={"ID":"b58267f5-6c56-4cc8-89a3-c89233cb9f59","Type":"ContainerStarted","Data":"c5f6f277bfc4cfef90224c07846708f4e0b94f55bac032a8a9beb8f80c1115aa"} Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.219814 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.233037 4812 patch_prober.go:28] interesting pod/oauth-openshift-75ddf57cb8-snhfn container/oauth-openshift namespace/openshift-authentication: Readiness probe status=failure output="Get \"https://10.217.0.64:6443/healthz\": read tcp 10.217.0.2:55462->10.217.0.64:6443: read: connection reset by peer" start-of-body= Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.233121 4812 prober.go:107] "Probe failed" probeType="Readiness" 
pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" podUID="b58267f5-6c56-4cc8-89a3-c89233cb9f59" containerName="oauth-openshift" probeResult="failure" output="Get \"https://10.217.0.64:6443/healthz\": read tcp 10.217.0.2:55462->10.217.0.64:6443: read: connection reset by peer" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.254296 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" podStartSLOduration=75.254262252 podStartE2EDuration="1m15.254262252s" podCreationTimestamp="2026-01-31 04:30:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:31:17.25124395 +0000 UTC m=+285.746265615" watchObservedRunningTime="2026-01-31 04:31:17.254262252 +0000 UTC m=+285.749283977" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.270538 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-console-operator"/"console-operator-config" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.412520 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.431597 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-marketplace"/"marketplace-trusted-ca" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.683294 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-multus"/"multus-ac-dockercfg-9lkdf" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.713002 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-kube-apiserver-operator"/"kube-apiserver-operator-serving-cert" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.886922 4812 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-network-operator"/"openshift-service-ca.crt" Jan 31 04:31:17 crc kubenswrapper[4812]: I0131 04:31:17.949877 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-machine-config-operator"/"machine-config-server-dockercfg-qx5rd" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.069255 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver"/"openshift-service-ca.crt" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.170593 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-console"/"console-serving-cert" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.228742 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-75ddf57cb8-snhfn_b58267f5-6c56-4cc8-89a3-c89233cb9f59/oauth-openshift/0.log" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.228804 4812 generic.go:334] "Generic (PLEG): container finished" podID="b58267f5-6c56-4cc8-89a3-c89233cb9f59" containerID="a382751d78f65ee35dd30f67265d844e705975a057d0c664f617f5a72cfb1b79" exitCode=255 Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.228853 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" event={"ID":"b58267f5-6c56-4cc8-89a3-c89233cb9f59","Type":"ContainerDied","Data":"a382751d78f65ee35dd30f67265d844e705975a057d0c664f617f5a72cfb1b79"} Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.229647 4812 scope.go:117] "RemoveContainer" containerID="a382751d78f65ee35dd30f67265d844e705975a057d0c664f617f5a72cfb1b79" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.259399 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ovn-kubernetes"/"ovn-kubernetes-control-plane-dockercfg-gs7dd" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.423428 4812 reflector.go:368] Caches populated for *v1.ConfigMap from 
object-"openshift-multus"/"cni-copy-resources" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.731312 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-ingress-operator"/"ingress-operator-dockercfg-7lnqk" Jan 31 04:31:18 crc kubenswrapper[4812]: I0131 04:31:18.748529 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-service-ca.crt" Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.219918 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"marketplace-operator-dockercfg-5nsgg" Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.239767 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-authentication_oauth-openshift-75ddf57cb8-snhfn_b58267f5-6c56-4cc8-89a3-c89233cb9f59/oauth-openshift/0.log" Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.239878 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" event={"ID":"b58267f5-6c56-4cc8-89a3-c89233cb9f59","Type":"ContainerStarted","Data":"fc25e0579d22df5a5810f97895dbddda64e10955e207dff4255a04addc499ed8"} Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.240251 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.248119 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-authentication/oauth-openshift-75ddf57cb8-snhfn" Jan 31 04:31:19 crc kubenswrapper[4812]: I0131 04:31:19.369903 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-dns"/"node-resolver-dockercfg-kz9s7" Jan 31 04:31:20 crc kubenswrapper[4812]: I0131 04:31:20.306125 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:31:26 crc kubenswrapper[4812]: I0131 
04:31:26.766616 4812 kubelet.go:2431] "SyncLoop REMOVE" source="file" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 04:31:26 crc kubenswrapper[4812]: I0131 04:31:26.767522 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" containerID="cri-o://d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94" gracePeriod=5 Jan 31 04:31:31 crc kubenswrapper[4812]: I0131 04:31:31.896375 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 04:31:31 crc kubenswrapper[4812]: I0131 04:31:31.896998 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.051696 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.051760 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.051873 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 04:31:32 crc 
kubenswrapper[4812]: I0131 04:31:32.051899 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.051933 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-log\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") pod \"f85e55b1a89d02b0cb034b1ea31ed45a\" (UID: \"f85e55b1a89d02b0cb034b1ea31ed45a\") " Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.052234 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log" (OuterVolumeSpecName: "var-log") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.052782 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests" (OuterVolumeSpecName: "manifests") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "manifests". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.052858 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock" (OuterVolumeSpecName: "var-lock") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "var-lock". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.052893 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir" (OuterVolumeSpecName: "resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.060617 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir" (OuterVolumeSpecName: "pod-resource-dir") pod "f85e55b1a89d02b0cb034b1ea31ed45a" (UID: "f85e55b1a89d02b0cb034b1ea31ed45a"). InnerVolumeSpecName "pod-resource-dir". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.121553 4812 cert_rotation.go:91] certificate rotation detected, shutting down client connections to start using new credentials Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.153175 4812 reconciler_common.go:293] "Volume detached for volume \"manifests\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-manifests\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.153215 4812 reconciler_common.go:293] "Volume detached for volume \"var-lock\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-lock\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.153234 4812 reconciler_common.go:293] "Volume detached for volume \"resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.153249 4812 reconciler_common.go:293] "Volume detached for volume \"var-log\" 
(UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-var-log\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.153265 4812 reconciler_common.go:293] "Volume detached for volume \"pod-resource-dir\" (UniqueName: \"kubernetes.io/host-path/f85e55b1a89d02b0cb034b1ea31ed45a-pod-resource-dir\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.361562 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" path="/var/lib/kubelet/pods/f85e55b1a89d02b0cb034b1ea31ed45a/volumes" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.362178 4812 mirror_client.go:130] "Deleting a mirror pod" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" podUID="" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.363910 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-apiserver_kube-apiserver-startup-monitor-crc_f85e55b1a89d02b0cb034b1ea31ed45a/startup-monitor/0.log" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.363968 4812 generic.go:334] "Generic (PLEG): container finished" podID="f85e55b1a89d02b0cb034b1ea31ed45a" containerID="d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94" exitCode=137 Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.364038 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.382209 4812 scope.go:117] "RemoveContainer" containerID="d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.385894 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.385956 4812 kubelet.go:2649] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2efddcee-456b-4227-9c8d-5b359876e408" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.410525 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-kube-apiserver/kube-apiserver-startup-monitor-crc"] Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.410786 4812 kubelet.go:2673] "Unable to find pod for mirror pod, skipping" mirrorPod="openshift-kube-apiserver/kube-apiserver-startup-monitor-crc" mirrorPodUID="2efddcee-456b-4227-9c8d-5b359876e408" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.419984 4812 scope.go:117] "RemoveContainer" containerID="d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94" Jan 31 04:31:32 crc kubenswrapper[4812]: E0131 04:31:32.420387 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94\": container with ID starting with d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94 not found: ID does not exist" containerID="d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94" Jan 31 04:31:32 crc kubenswrapper[4812]: I0131 04:31:32.420538 4812 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94"} err="failed to get container status \"d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94\": rpc error: code = NotFound desc = could not find container \"d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94\": container with ID starting with d71b817509d82117480d9ce0652344e6cf73086a931b70ab962e10e421f09d94 not found: ID does not exist" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.388293 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.388789 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" podUID="9b59154e-f9e6-475d-9fc7-77af2533a402" containerName="controller-manager" containerID="cri-o://3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55" gracePeriod=30 Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.485448 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.485696 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" podUID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" containerName="route-controller-manager" containerID="cri-o://7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5" gracePeriod=30 Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.780455 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.837508 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.929982 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca\") pod \"9b59154e-f9e6-475d-9fc7-77af2533a402\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930094 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rnj72\" (UniqueName: \"kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72\") pod \"9b59154e-f9e6-475d-9fc7-77af2533a402\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930136 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert\") pod \"9b59154e-f9e6-475d-9fc7-77af2533a402\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930189 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config\") pod \"9b59154e-f9e6-475d-9fc7-77af2533a402\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930252 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles\") pod \"9b59154e-f9e6-475d-9fc7-77af2533a402\" (UID: \"9b59154e-f9e6-475d-9fc7-77af2533a402\") " Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930618 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for 
volume "kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca" (OuterVolumeSpecName: "client-ca") pod "9b59154e-f9e6-475d-9fc7-77af2533a402" (UID: "9b59154e-f9e6-475d-9fc7-77af2533a402"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930693 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "9b59154e-f9e6-475d-9fc7-77af2533a402" (UID: "9b59154e-f9e6-475d-9fc7-77af2533a402"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.930744 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config" (OuterVolumeSpecName: "config") pod "9b59154e-f9e6-475d-9fc7-77af2533a402" (UID: "9b59154e-f9e6-475d-9fc7-77af2533a402"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.934604 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72" (OuterVolumeSpecName: "kube-api-access-rnj72") pod "9b59154e-f9e6-475d-9fc7-77af2533a402" (UID: "9b59154e-f9e6-475d-9fc7-77af2533a402"). InnerVolumeSpecName "kube-api-access-rnj72". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:31:37 crc kubenswrapper[4812]: I0131 04:31:37.935369 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "9b59154e-f9e6-475d-9fc7-77af2533a402" (UID: "9b59154e-f9e6-475d-9fc7-77af2533a402"). InnerVolumeSpecName "serving-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.031686 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca\") pod \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.031735 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert\") pod \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.031771 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d6ctw\" (UniqueName: \"kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw\") pod \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.031798 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config\") pod \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\" (UID: \"f87f95aa-2fc9-4c0d-982c-e37190ae86cc\") " Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032032 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032049 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc 
kubenswrapper[4812]: I0131 04:31:38.032058 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rnj72\" (UniqueName: \"kubernetes.io/projected/9b59154e-f9e6-475d-9fc7-77af2533a402-kube-api-access-rnj72\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032067 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/9b59154e-f9e6-475d-9fc7-77af2533a402-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032077 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/9b59154e-f9e6-475d-9fc7-77af2533a402-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032576 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config" (OuterVolumeSpecName: "config") pod "f87f95aa-2fc9-4c0d-982c-e37190ae86cc" (UID: "f87f95aa-2fc9-4c0d-982c-e37190ae86cc"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.032742 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca" (OuterVolumeSpecName: "client-ca") pod "f87f95aa-2fc9-4c0d-982c-e37190ae86cc" (UID: "f87f95aa-2fc9-4c0d-982c-e37190ae86cc"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.036753 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw" (OuterVolumeSpecName: "kube-api-access-d6ctw") pod "f87f95aa-2fc9-4c0d-982c-e37190ae86cc" (UID: "f87f95aa-2fc9-4c0d-982c-e37190ae86cc"). 
InnerVolumeSpecName "kube-api-access-d6ctw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.036788 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "f87f95aa-2fc9-4c0d-982c-e37190ae86cc" (UID: "f87f95aa-2fc9-4c0d-982c-e37190ae86cc"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.133449 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.133506 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.133531 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d6ctw\" (UniqueName: \"kubernetes.io/projected/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-kube-api-access-d6ctw\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.133556 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/f87f95aa-2fc9-4c0d-982c-e37190ae86cc-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.396938 4812 generic.go:334] "Generic (PLEG): container finished" podID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" containerID="7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5" exitCode=0 Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.397033 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" event={"ID":"f87f95aa-2fc9-4c0d-982c-e37190ae86cc","Type":"ContainerDied","Data":"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5"} Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.397061 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" event={"ID":"f87f95aa-2fc9-4c0d-982c-e37190ae86cc","Type":"ContainerDied","Data":"79147bd82f97aa612f434c5631dbe2b8c4e8662f15e8a60a965f58ad7c311787"} Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.397032 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.397101 4812 scope.go:117] "RemoveContainer" containerID="7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.399369 4812 generic.go:334] "Generic (PLEG): container finished" podID="9b59154e-f9e6-475d-9fc7-77af2533a402" containerID="3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55" exitCode=0 Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.399419 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" event={"ID":"9b59154e-f9e6-475d-9fc7-77af2533a402","Type":"ContainerDied","Data":"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55"} Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.399458 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" event={"ID":"9b59154e-f9e6-475d-9fc7-77af2533a402","Type":"ContainerDied","Data":"62b49f3ca5b10fa341cee379cbc55cd6d1dc4cb2e1ff4f290b08681276196184"} Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.399520 4812 util.go:48] 
"No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.415384 4812 scope.go:117] "RemoveContainer" containerID="7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5" Jan 31 04:31:38 crc kubenswrapper[4812]: E0131 04:31:38.422282 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5\": container with ID starting with 7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5 not found: ID does not exist" containerID="7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.422350 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5"} err="failed to get container status \"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5\": rpc error: code = NotFound desc = could not find container \"7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5\": container with ID starting with 7169ac76e046849e08c1fe984721e7f057f655c63a7b39a4be01fcd70224e7b5 not found: ID does not exist" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.422390 4812 scope.go:117] "RemoveContainer" containerID="3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.426587 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.437874 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-c59479947-k6l6x"] Jan 31 04:31:38 crc kubenswrapper[4812]: 
I0131 04:31:38.445128 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.450087 4812 scope.go:117] "RemoveContainer" containerID="3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55" Jan 31 04:31:38 crc kubenswrapper[4812]: E0131 04:31:38.451019 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55\": container with ID starting with 3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55 not found: ID does not exist" containerID="3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.451072 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55"} err="failed to get container status \"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55\": rpc error: code = NotFound desc = could not find container \"3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55\": container with ID starting with 3079b2a7a49a25c75f01460b449f1e42c29d79adc2f4234854ad673edce5ef55 not found: ID does not exist" Jan 31 04:31:38 crc kubenswrapper[4812]: I0131 04:31:38.451358 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-69cf84b9dc-tkf9z"] Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.015893 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"] Jan 31 04:31:39 crc kubenswrapper[4812]: E0131 04:31:39.016202 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" containerName="route-controller-manager" Jan 31 04:31:39 crc 
kubenswrapper[4812]: I0131 04:31:39.016223 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" containerName="route-controller-manager" Jan 31 04:31:39 crc kubenswrapper[4812]: E0131 04:31:39.016252 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.016268 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 04:31:39 crc kubenswrapper[4812]: E0131 04:31:39.016282 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9b59154e-f9e6-475d-9fc7-77af2533a402" containerName="controller-manager" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.016296 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9b59154e-f9e6-475d-9fc7-77af2533a402" containerName="controller-manager" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.016452 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9b59154e-f9e6-475d-9fc7-77af2533a402" containerName="controller-manager" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.016471 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" containerName="route-controller-manager" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.016491 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="f85e55b1a89d02b0cb034b1ea31ed45a" containerName="startup-monitor" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.017059 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.019747 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.020924 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.022665 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.026560 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.026963 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.034753 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.035042 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.035082 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"] Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.036181 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.045632 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.046180 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.046471 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.046907 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.047173 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.047396 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"] Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.061298 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.076575 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"] Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164450 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: 
\"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164496 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164514 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164532 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164549 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164574 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164726 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164869 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9v8nr\" (UniqueName: \"kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.164919 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qcfqm\" (UniqueName: \"kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266541 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca\") pod 
\"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266613 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266685 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9v8nr\" (UniqueName: \"kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266726 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qcfqm\" (UniqueName: \"kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266805 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266862 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266895 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266928 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.266961 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.267650 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " 
pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.268299 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.268879 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.268914 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.269107 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.272630 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert\") pod 
\"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.274387 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.294153 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qcfqm\" (UniqueName: \"kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm\") pod \"controller-manager-5fd44d7946-kkmvs\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.294929 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9v8nr\" (UniqueName: \"kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr\") pod \"route-controller-manager-8448bc74bf-5grq4\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.364980 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.383515 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.630672 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"] Jan 31 04:31:39 crc kubenswrapper[4812]: I0131 04:31:39.801752 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"] Jan 31 04:31:39 crc kubenswrapper[4812]: W0131 04:31:39.811324 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6409a043_5c05_48d6_96e0_c9fb1f05d3c2.slice/crio-81d3fe7c5628a12c57f939262ee24a0a75d91b5f87dfd7ede33257a7bbd00764 WatchSource:0}: Error finding container 81d3fe7c5628a12c57f939262ee24a0a75d91b5f87dfd7ede33257a7bbd00764: Status 404 returned error can't find the container with id 81d3fe7c5628a12c57f939262ee24a0a75d91b5f87dfd7ede33257a7bbd00764 Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.345035 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9b59154e-f9e6-475d-9fc7-77af2533a402" path="/var/lib/kubelet/pods/9b59154e-f9e6-475d-9fc7-77af2533a402/volumes" Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.345684 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f87f95aa-2fc9-4c0d-982c-e37190ae86cc" path="/var/lib/kubelet/pods/f87f95aa-2fc9-4c0d-982c-e37190ae86cc/volumes" Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.415954 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" event={"ID":"6409a043-5c05-48d6-96e0-c9fb1f05d3c2","Type":"ContainerStarted","Data":"0a9550c4960604e4001eb77f1c416eaf27839a07224a829579cbb0e9a8fc6272"} Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.415999 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" event={"ID":"6409a043-5c05-48d6-96e0-c9fb1f05d3c2","Type":"ContainerStarted","Data":"81d3fe7c5628a12c57f939262ee24a0a75d91b5f87dfd7ede33257a7bbd00764"} Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.416346 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.417969 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" event={"ID":"6e49100e-72fc-4c6f-8aa0-77ef7db40c41","Type":"ContainerStarted","Data":"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"} Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.418023 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" event={"ID":"6e49100e-72fc-4c6f-8aa0-77ef7db40c41","Type":"ContainerStarted","Data":"9f65edddd024d0752513e5c53859b0a5fa3a80d4bbfa12299a60b604cb4c8437"} Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.420131 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.436024 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" podStartSLOduration=3.436012279 podStartE2EDuration="3.436012279s" podCreationTimestamp="2026-01-31 04:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:31:40.43422424 +0000 UTC m=+308.929245905" watchObservedRunningTime="2026-01-31 04:31:40.436012279 +0000 UTC m=+308.931033944" Jan 31 04:31:40 crc kubenswrapper[4812]: I0131 04:31:40.482151 
4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" podStartSLOduration=3.482129891 podStartE2EDuration="3.482129891s" podCreationTimestamp="2026-01-31 04:31:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:31:40.482004607 +0000 UTC m=+308.977026272" watchObservedRunningTime="2026-01-31 04:31:40.482129891 +0000 UTC m=+308.977151556" Jan 31 04:31:41 crc kubenswrapper[4812]: I0131 04:31:41.428043 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:41 crc kubenswrapper[4812]: I0131 04:31:41.436062 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:43 crc kubenswrapper[4812]: I0131 04:31:43.981586 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-apiserver-operator"/"openshift-apiserver-operator-config" Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.391857 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"] Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.392421 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" podUID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" containerName="controller-manager" containerID="cri-o://0a9550c4960604e4001eb77f1c416eaf27839a07224a829579cbb0e9a8fc6272" gracePeriod=30 Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.423034 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"] Jan 31 04:31:57 crc 
kubenswrapper[4812]: I0131 04:31:57.423300 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" podUID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" containerName="route-controller-manager" containerID="cri-o://32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01" gracePeriod=30 Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.538785 4812 generic.go:334] "Generic (PLEG): container finished" podID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" containerID="0a9550c4960604e4001eb77f1c416eaf27839a07224a829579cbb0e9a8fc6272" exitCode=0 Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.538852 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" event={"ID":"6409a043-5c05-48d6-96e0-c9fb1f05d3c2","Type":"ContainerDied","Data":"0a9550c4960604e4001eb77f1c416eaf27839a07224a829579cbb0e9a8fc6272"} Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.964577 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" Jan 31 04:31:57 crc kubenswrapper[4812]: I0131 04:31:57.972057 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163071 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles\") pod \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163135 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca\") pod \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163212 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert\") pod \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163279 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9v8nr\" (UniqueName: \"kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr\") pod \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163332 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qcfqm\" (UniqueName: \"kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm\") pod \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163400 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config\") pod \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163434 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca\") pod \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163511 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config\") pod \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\" (UID: \"6e49100e-72fc-4c6f-8aa0-77ef7db40c41\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.163542 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert\") pod \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\" (UID: \"6409a043-5c05-48d6-96e0-c9fb1f05d3c2\") " Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.164535 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca" (OuterVolumeSpecName: "client-ca") pod "6409a043-5c05-48d6-96e0-c9fb1f05d3c2" (UID: "6409a043-5c05-48d6-96e0-c9fb1f05d3c2"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.164711 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config" (OuterVolumeSpecName: "config") pod "6409a043-5c05-48d6-96e0-c9fb1f05d3c2" (UID: "6409a043-5c05-48d6-96e0-c9fb1f05d3c2"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.166186 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca" (OuterVolumeSpecName: "client-ca") pod "6e49100e-72fc-4c6f-8aa0-77ef7db40c41" (UID: "6e49100e-72fc-4c6f-8aa0-77ef7db40c41"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.166371 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "6409a043-5c05-48d6-96e0-c9fb1f05d3c2" (UID: "6409a043-5c05-48d6-96e0-c9fb1f05d3c2"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.166470 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config" (OuterVolumeSpecName: "config") pod "6e49100e-72fc-4c6f-8aa0-77ef7db40c41" (UID: "6e49100e-72fc-4c6f-8aa0-77ef7db40c41"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.171053 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm" (OuterVolumeSpecName: "kube-api-access-qcfqm") pod "6409a043-5c05-48d6-96e0-c9fb1f05d3c2" (UID: "6409a043-5c05-48d6-96e0-c9fb1f05d3c2"). InnerVolumeSpecName "kube-api-access-qcfqm". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.171149 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6409a043-5c05-48d6-96e0-c9fb1f05d3c2" (UID: "6409a043-5c05-48d6-96e0-c9fb1f05d3c2"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.171202 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "6e49100e-72fc-4c6f-8aa0-77ef7db40c41" (UID: "6e49100e-72fc-4c6f-8aa0-77ef7db40c41"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.171368 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr" (OuterVolumeSpecName: "kube-api-access-9v8nr") pod "6e49100e-72fc-4c6f-8aa0-77ef7db40c41" (UID: "6e49100e-72fc-4c6f-8aa0-77ef7db40c41"). InnerVolumeSpecName "kube-api-access-9v8nr". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265405 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9v8nr\" (UniqueName: \"kubernetes.io/projected/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-kube-api-access-9v8nr\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265474 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qcfqm\" (UniqueName: \"kubernetes.io/projected/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-kube-api-access-qcfqm\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265501 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-config\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265524 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265548 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265575 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-config\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265592 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-proxy-ca-bundles\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265608 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/6409a043-5c05-48d6-96e0-c9fb1f05d3c2-client-ca\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.265626 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/6e49100e-72fc-4c6f-8aa0-77ef7db40c41-serving-cert\") on node \"crc\" DevicePath \"\""
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.551472 4812 generic.go:334] "Generic (PLEG): container finished" podID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" containerID="32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01" exitCode=0
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.551576 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" event={"ID":"6e49100e-72fc-4c6f-8aa0-77ef7db40c41","Type":"ContainerDied","Data":"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"}
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.551616 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4" event={"ID":"6e49100e-72fc-4c6f-8aa0-77ef7db40c41","Type":"ContainerDied","Data":"9f65edddd024d0752513e5c53859b0a5fa3a80d4bbfa12299a60b604cb4c8437"}
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.551644 4812 scope.go:117] "RemoveContainer" containerID="32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.551818 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.556448 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs" event={"ID":"6409a043-5c05-48d6-96e0-c9fb1f05d3c2","Type":"ContainerDied","Data":"81d3fe7c5628a12c57f939262ee24a0a75d91b5f87dfd7ede33257a7bbd00764"}
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.556547 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.590100 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"]
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.592904 4812 scope.go:117] "RemoveContainer" containerID="32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"
Jan 31 04:31:58 crc kubenswrapper[4812]: E0131 04:31:58.593675 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01\": container with ID starting with 32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01 not found: ID does not exist" containerID="32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.593733 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01"} err="failed to get container status \"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01\": rpc error: code = NotFound desc = could not find container \"32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01\": container with ID starting with 32fc52501414b1565ef886545d370cb2fc9ab19bfa189f306290669eeadcbe01 not found: ID does not exist"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.593776 4812 scope.go:117] "RemoveContainer" containerID="0a9550c4960604e4001eb77f1c416eaf27839a07224a829579cbb0e9a8fc6272"
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.601518 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-5grq4"]
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.608441 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"]
Jan 31 04:31:58 crc kubenswrapper[4812]: I0131 04:31:58.613512 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-kkmvs"]
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.029799 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"]
Jan 31 04:31:59 crc kubenswrapper[4812]: E0131 04:31:59.030133 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" containerName="route-controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.030154 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" containerName="route-controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: E0131 04:31:59.030181 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" containerName="controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.030194 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" containerName="controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.030625 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" containerName="controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.030669 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" containerName="route-controller-manager"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.031474 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.035923 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.036343 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.036589 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.036927 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.037223 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.037438 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.045317 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"]
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.046406 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.048989 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.049580 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.053378 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.053947 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.053983 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.054256 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.054448 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"]
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.067989 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"]
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.080273 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.178931 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179140 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179213 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179286 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hh8tx\" (UniqueName: \"kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179400 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndph6\" (UniqueName: \"kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179447 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179513 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179571 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.179603 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280324 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280381 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280414 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hh8tx\" (UniqueName: \"kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280459 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ndph6\" (UniqueName: \"kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280486 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280526 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280551 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280570 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.280599 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.282356 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.282361 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.282791 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.283264 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.283639 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.288662 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.289094 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.301925 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hh8tx\" (UniqueName: \"kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx\") pod \"controller-manager-6796c7fd86-fl447\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.316308 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndph6\" (UniqueName: \"kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6\") pod \"route-controller-manager-66b4db8ff5-6dsjm\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.350354 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.381316 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.627455 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"]
Jan 31 04:31:59 crc kubenswrapper[4812]: W0131 04:31:59.642963 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod96aeb59a_f6e9_47a5_9b74_4c958754bd52.slice/crio-b028b563b577574e5268b478535d6a8b3a1e0bce8891f4193b800eb5c699f8e3 WatchSource:0}: Error finding container b028b563b577574e5268b478535d6a8b3a1e0bce8891f4193b800eb5c699f8e3: Status 404 returned error can't find the container with id b028b563b577574e5268b478535d6a8b3a1e0bce8891f4193b800eb5c699f8e3
Jan 31 04:31:59 crc kubenswrapper[4812]: I0131 04:31:59.725226 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"]
Jan 31 04:31:59 crc kubenswrapper[4812]: W0131 04:31:59.727262 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod417ab395_407a_47ac_a624_a9a3e801ae35.slice/crio-60144b7cd075aa4d693691ec1106bcd9d1af9c421cc198467f11a6294f89e904 WatchSource:0}: Error finding container 60144b7cd075aa4d693691ec1106bcd9d1af9c421cc198467f11a6294f89e904: Status 404 returned error can't find the container with id 60144b7cd075aa4d693691ec1106bcd9d1af9c421cc198467f11a6294f89e904
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.345454 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6409a043-5c05-48d6-96e0-c9fb1f05d3c2" path="/var/lib/kubelet/pods/6409a043-5c05-48d6-96e0-c9fb1f05d3c2/volumes"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.346131 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6e49100e-72fc-4c6f-8aa0-77ef7db40c41" path="/var/lib/kubelet/pods/6e49100e-72fc-4c6f-8aa0-77ef7db40c41/volumes"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.576134 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" event={"ID":"96aeb59a-f6e9-47a5-9b74-4c958754bd52","Type":"ContainerStarted","Data":"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a"}
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.576184 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" event={"ID":"96aeb59a-f6e9-47a5-9b74-4c958754bd52","Type":"ContainerStarted","Data":"b028b563b577574e5268b478535d6a8b3a1e0bce8891f4193b800eb5c699f8e3"}
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.576398 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.577632 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" event={"ID":"417ab395-407a-47ac-a624-a9a3e801ae35","Type":"ContainerStarted","Data":"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab"}
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.577674 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" event={"ID":"417ab395-407a-47ac-a624-a9a3e801ae35","Type":"ContainerStarted","Data":"60144b7cd075aa4d693691ec1106bcd9d1af9c421cc198467f11a6294f89e904"}
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.578035 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.584779 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.585323 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.601937 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" podStartSLOduration=3.601921184 podStartE2EDuration="3.601921184s" podCreationTimestamp="2026-01-31 04:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:32:00.599499846 +0000 UTC m=+329.094521551" watchObservedRunningTime="2026-01-31 04:32:00.601921184 +0000 UTC m=+329.096942849"
Jan 31 04:32:00 crc kubenswrapper[4812]: I0131 04:32:00.619607 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" podStartSLOduration=3.619588282 podStartE2EDuration="3.619588282s" podCreationTimestamp="2026-01-31 04:31:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:32:00.615795517 +0000 UTC m=+329.110817232" watchObservedRunningTime="2026-01-31 04:32:00.619588282 +0000 UTC m=+329.114609947"
Jan 31 04:32:44 crc kubenswrapper[4812]: I0131 04:32:44.338430 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 04:32:44 crc kubenswrapper[4812]: I0131 04:32:44.339083 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.445347 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8mmpn"]
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.447204 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.479412 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8mmpn"]
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602426 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97182ddb-92e1-4a39-8097-57ef56edc9a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602520 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-tls\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602605 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-bound-sa-token\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602669 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97182ddb-92e1-4a39-8097-57ef56edc9a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602715 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-trusted-ca\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.602943 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.603019 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pwpd9\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-kube-api-access-pwpd9\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.603100 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-certificates\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.628882 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704532 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97182ddb-92e1-4a39-8097-57ef56edc9a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704580 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-trusted-ca\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704618 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pwpd9\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-kube-api-access-pwpd9\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704648 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-certificates\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704674 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97182ddb-92e1-4a39-8097-57ef56edc9a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704697 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-tls\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.704713 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-bound-sa-token\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.706219 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/97182ddb-92e1-4a39-8097-57ef56edc9a4-ca-trust-extracted\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.706393 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-certificates\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.706649 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/97182ddb-92e1-4a39-8097-57ef56edc9a4-trusted-ca\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.716677 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/97182ddb-92e1-4a39-8097-57ef56edc9a4-installation-pull-secrets\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.717091 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-registry-tls\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn"
Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.722799 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pwpd9\" (UniqueName:
\"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-kube-api-access-pwpd9\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.725304 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/97182ddb-92e1-4a39-8097-57ef56edc9a4-bound-sa-token\") pod \"image-registry-66df7c8f76-8mmpn\" (UID: \"97182ddb-92e1-4a39-8097-57ef56edc9a4\") " pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" Jan 31 04:32:50 crc kubenswrapper[4812]: I0131 04:32:50.789524 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" Jan 31 04:32:51 crc kubenswrapper[4812]: I0131 04:32:51.323784 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-66df7c8f76-8mmpn"] Jan 31 04:32:51 crc kubenswrapper[4812]: I0131 04:32:51.946940 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" event={"ID":"97182ddb-92e1-4a39-8097-57ef56edc9a4","Type":"ContainerStarted","Data":"39b0efb4e56c9c108e80e9dd881444a5286ab83f034249f0f5eda377d272d189"} Jan 31 04:32:51 crc kubenswrapper[4812]: I0131 04:32:51.947267 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" event={"ID":"97182ddb-92e1-4a39-8097-57ef56edc9a4","Type":"ContainerStarted","Data":"2c6f6a473bcf182ef4216f4b28de19ca71c82a1fdd29e7cba75621aaddc3d889"} Jan 31 04:32:51 crc kubenswrapper[4812]: I0131 04:32:51.950284 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.399338 4812 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" podStartSLOduration=7.399318655 podStartE2EDuration="7.399318655s" podCreationTimestamp="2026-01-31 04:32:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:32:51.995778034 +0000 UTC m=+380.490799729" watchObservedRunningTime="2026-01-31 04:32:57.399318655 +0000 UTC m=+385.894340320" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.402299 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"] Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.402600 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" podUID="417ab395-407a-47ac-a624-a9a3e801ae35" containerName="controller-manager" containerID="cri-o://4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab" gracePeriod=30 Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.436332 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"] Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.436832 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" podUID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" containerName="route-controller-manager" containerID="cri-o://7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a" gracePeriod=30 Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.789653 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.870451 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.921134 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles\") pod \"417ab395-407a-47ac-a624-a9a3e801ae35\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.921212 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config\") pod \"417ab395-407a-47ac-a624-a9a3e801ae35\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.921366 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert\") pod \"417ab395-407a-47ac-a624-a9a3e801ae35\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.922185 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config" (OuterVolumeSpecName: "config") pod "417ab395-407a-47ac-a624-a9a3e801ae35" (UID: "417ab395-407a-47ac-a624-a9a3e801ae35"). InnerVolumeSpecName "config". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.922210 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles" (OuterVolumeSpecName: "proxy-ca-bundles") pod "417ab395-407a-47ac-a624-a9a3e801ae35" (UID: "417ab395-407a-47ac-a624-a9a3e801ae35"). InnerVolumeSpecName "proxy-ca-bundles". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.922605 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca\") pod \"417ab395-407a-47ac-a624-a9a3e801ae35\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.922666 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hh8tx\" (UniqueName: \"kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx\") pod \"417ab395-407a-47ac-a624-a9a3e801ae35\" (UID: \"417ab395-407a-47ac-a624-a9a3e801ae35\") " Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.923154 4812 reconciler_common.go:293] "Volume detached for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-proxy-ca-bundles\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.923229 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.923373 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca" (OuterVolumeSpecName: "client-ca") pod 
"417ab395-407a-47ac-a624-a9a3e801ae35" (UID: "417ab395-407a-47ac-a624-a9a3e801ae35"). InnerVolumeSpecName "client-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.927613 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "417ab395-407a-47ac-a624-a9a3e801ae35" (UID: "417ab395-407a-47ac-a624-a9a3e801ae35"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.928314 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx" (OuterVolumeSpecName: "kube-api-access-hh8tx") pod "417ab395-407a-47ac-a624-a9a3e801ae35" (UID: "417ab395-407a-47ac-a624-a9a3e801ae35"). InnerVolumeSpecName "kube-api-access-hh8tx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.986222 4812 generic.go:334] "Generic (PLEG): container finished" podID="417ab395-407a-47ac-a624-a9a3e801ae35" containerID="4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab" exitCode=0 Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.986279 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" event={"ID":"417ab395-407a-47ac-a624-a9a3e801ae35","Type":"ContainerDied","Data":"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab"} Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.986264 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.986336 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-6796c7fd86-fl447" event={"ID":"417ab395-407a-47ac-a624-a9a3e801ae35","Type":"ContainerDied","Data":"60144b7cd075aa4d693691ec1106bcd9d1af9c421cc198467f11a6294f89e904"} Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.986349 4812 scope.go:117] "RemoveContainer" containerID="4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab" Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.989761 4812 generic.go:334] "Generic (PLEG): container finished" podID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" containerID="7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a" exitCode=0 Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.989804 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" event={"ID":"96aeb59a-f6e9-47a5-9b74-4c958754bd52","Type":"ContainerDied","Data":"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a"} Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.989923 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" event={"ID":"96aeb59a-f6e9-47a5-9b74-4c958754bd52","Type":"ContainerDied","Data":"b028b563b577574e5268b478535d6a8b3a1e0bce8891f4193b800eb5c699f8e3"} Jan 31 04:32:57 crc kubenswrapper[4812]: I0131 04:32:57.989828 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.007246 4812 scope.go:117] "RemoveContainer" containerID="4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab" Jan 31 04:32:58 crc kubenswrapper[4812]: E0131 04:32:58.007796 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab\": container with ID starting with 4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab not found: ID does not exist" containerID="4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.007886 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab"} err="failed to get container status \"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab\": rpc error: code = NotFound desc = could not find container \"4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab\": container with ID starting with 4102a3f321f657c424d302133cc74ffc4441fd012dedd8e19419b88a3d0280ab not found: ID does not exist" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.007928 4812 scope.go:117] "RemoveContainer" containerID="7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.023908 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert\") pod \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.024094 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config\") pod \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.024344 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca\") pod \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.024423 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndph6\" (UniqueName: \"kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6\") pod \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\" (UID: \"96aeb59a-f6e9-47a5-9b74-4c958754bd52\") " Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.025092 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/417ab395-407a-47ac-a624-a9a3e801ae35-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.025131 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/417ab395-407a-47ac-a624-a9a3e801ae35-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.025150 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hh8tx\" (UniqueName: \"kubernetes.io/projected/417ab395-407a-47ac-a624-a9a3e801ae35-kube-api-access-hh8tx\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.031168 4812 scope.go:117] "RemoveContainer" containerID="7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a" Jan 31 04:32:58 crc kubenswrapper[4812]: E0131 04:32:58.031822 4812 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a\": container with ID starting with 7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a not found: ID does not exist" containerID="7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.031882 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config" (OuterVolumeSpecName: "config") pod "96aeb59a-f6e9-47a5-9b74-4c958754bd52" (UID: "96aeb59a-f6e9-47a5-9b74-4c958754bd52"). InnerVolumeSpecName "config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.031953 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a"} err="failed to get container status \"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a\": rpc error: code = NotFound desc = could not find container \"7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a\": container with ID starting with 7446f002e7e57a47e5871ad665bab771d14343b5d3826277998fa28251e73e3a not found: ID does not exist" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.032763 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca" (OuterVolumeSpecName: "client-ca") pod "96aeb59a-f6e9-47a5-9b74-4c958754bd52" (UID: "96aeb59a-f6e9-47a5-9b74-4c958754bd52"). InnerVolumeSpecName "client-ca". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.034541 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6" (OuterVolumeSpecName: "kube-api-access-ndph6") pod "96aeb59a-f6e9-47a5-9b74-4c958754bd52" (UID: "96aeb59a-f6e9-47a5-9b74-4c958754bd52"). InnerVolumeSpecName "kube-api-access-ndph6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.035603 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert" (OuterVolumeSpecName: "serving-cert") pod "96aeb59a-f6e9-47a5-9b74-4c958754bd52" (UID: "96aeb59a-f6e9-47a5-9b74-4c958754bd52"). InnerVolumeSpecName "serving-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.041300 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"] Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.046012 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-controller-manager/controller-manager-6796c7fd86-fl447"] Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.126277 4812 reconciler_common.go:293] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.126314 4812 reconciler_common.go:293] "Volume detached for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/96aeb59a-f6e9-47a5-9b74-4c958754bd52-client-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.126327 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ndph6\" (UniqueName: 
\"kubernetes.io/projected/96aeb59a-f6e9-47a5-9b74-4c958754bd52-kube-api-access-ndph6\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.126338 4812 reconciler_common.go:293] "Volume detached for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/96aeb59a-f6e9-47a5-9b74-4c958754bd52-serving-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.338582 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"] Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.347722 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="417ab395-407a-47ac-a624-a9a3e801ae35" path="/var/lib/kubelet/pods/417ab395-407a-47ac-a624-a9a3e801ae35/volumes" Jan 31 04:32:58 crc kubenswrapper[4812]: I0131 04:32:58.348640 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-66b4db8ff5-6dsjm"] Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.067905 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-jvw8t"] Jan 31 04:32:59 crc kubenswrapper[4812]: E0131 04:32:59.068297 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" containerName="route-controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.068325 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" containerName="route-controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: E0131 04:32:59.068369 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="417ab395-407a-47ac-a624-a9a3e801ae35" containerName="controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.068386 4812 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="417ab395-407a-47ac-a624-a9a3e801ae35" containerName="controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.068580 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" containerName="route-controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.068620 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="417ab395-407a-47ac-a624-a9a3e801ae35" containerName="controller-manager" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.069236 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.075427 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"config" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.076238 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"client-ca" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.076375 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt"] Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.077469 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.083212 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-service-ca.crt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.083429 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"kube-root-ca.crt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.088276 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-route-controller-manager"/"serving-cert" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.088573 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"openshift-controller-manager-sa-dockercfg-msq4c" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.088801 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"config" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.089169 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-controller-manager"/"serving-cert" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.092927 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"openshift-service-ca.crt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.097938 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"client-ca" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.098392 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-route-controller-manager"/"kube-root-ca.crt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.109597 4812 reflector.go:368] Caches populated for *v1.Secret from 
object-"openshift-route-controller-manager"/"route-controller-manager-sa-dockercfg-h2zr2" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.111630 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-jvw8t"] Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.118784 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-controller-manager"/"openshift-global-ca" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.129122 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt"] Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.140690 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faefba43-23cb-4bb3-8d24-244ebfbcec0f-serving-cert\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.140806 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-proxy-ca-bundles\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.140890 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d4kpw\" (UniqueName: \"kubernetes.io/projected/faefba43-23cb-4bb3-8d24-244ebfbcec0f-kube-api-access-d4kpw\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " 
pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.140938 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-client-ca\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.141343 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-config\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.242972 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043ddf03-3d1c-42e9-b1d7-caab98f7564f-serving-cert\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243059 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-config\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243118 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-config\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243277 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-82zj6\" (UniqueName: \"kubernetes.io/projected/043ddf03-3d1c-42e9-b1d7-caab98f7564f-kube-api-access-82zj6\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243384 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-client-ca\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243485 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faefba43-23cb-4bb3-8d24-244ebfbcec0f-serving-cert\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243572 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-proxy-ca-bundles\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " 
pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243607 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-client-ca\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.243628 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d4kpw\" (UniqueName: \"kubernetes.io/projected/faefba43-23cb-4bb3-8d24-244ebfbcec0f-kube-api-access-d4kpw\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.244922 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-config\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.245251 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-client-ca\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.245484 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"proxy-ca-bundles\" (UniqueName: \"kubernetes.io/configmap/faefba43-23cb-4bb3-8d24-244ebfbcec0f-proxy-ca-bundles\") pod 
\"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.252163 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/faefba43-23cb-4bb3-8d24-244ebfbcec0f-serving-cert\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.274278 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d4kpw\" (UniqueName: \"kubernetes.io/projected/faefba43-23cb-4bb3-8d24-244ebfbcec0f-kube-api-access-d4kpw\") pod \"controller-manager-5fd44d7946-jvw8t\" (UID: \"faefba43-23cb-4bb3-8d24-244ebfbcec0f\") " pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.345139 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-config\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.345187 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-82zj6\" (UniqueName: \"kubernetes.io/projected/043ddf03-3d1c-42e9-b1d7-caab98f7564f-kube-api-access-82zj6\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.345207 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-client-ca\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.345253 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043ddf03-3d1c-42e9-b1d7-caab98f7564f-serving-cert\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.346992 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"client-ca\" (UniqueName: \"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-client-ca\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.347774 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/043ddf03-3d1c-42e9-b1d7-caab98f7564f-config\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.352483 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/043ddf03-3d1c-42e9-b1d7-caab98f7564f-serving-cert\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " 
pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.376165 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-82zj6\" (UniqueName: \"kubernetes.io/projected/043ddf03-3d1c-42e9-b1d7-caab98f7564f-kube-api-access-82zj6\") pod \"route-controller-manager-8448bc74bf-t9dzt\" (UID: \"043ddf03-3d1c-42e9-b1d7-caab98f7564f\") " pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.416894 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.450334 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.876175 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-controller-manager/controller-manager-5fd44d7946-jvw8t"] Jan 31 04:32:59 crc kubenswrapper[4812]: I0131 04:32:59.962805 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt"] Jan 31 04:32:59 crc kubenswrapper[4812]: W0131 04:32:59.971972 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod043ddf03_3d1c_42e9_b1d7_caab98f7564f.slice/crio-b69a6af46058357c3e2badc9cba7ed23cadc310417b500a544a5fceb27126087 WatchSource:0}: Error finding container b69a6af46058357c3e2badc9cba7ed23cadc310417b500a544a5fceb27126087: Status 404 returned error can't find the container with id b69a6af46058357c3e2badc9cba7ed23cadc310417b500a544a5fceb27126087 Jan 31 04:33:00 crc kubenswrapper[4812]: I0131 04:33:00.008475 4812 kubelet.go:2453] "SyncLoop (PLEG): 
event for pod" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" event={"ID":"faefba43-23cb-4bb3-8d24-244ebfbcec0f","Type":"ContainerStarted","Data":"ecb709fe88aaccd72f15bdc64afc0a0476c75d40a872d6410f3e913e860d3ddf"} Jan 31 04:33:00 crc kubenswrapper[4812]: I0131 04:33:00.013698 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" event={"ID":"043ddf03-3d1c-42e9-b1d7-caab98f7564f","Type":"ContainerStarted","Data":"b69a6af46058357c3e2badc9cba7ed23cadc310417b500a544a5fceb27126087"} Jan 31 04:33:00 crc kubenswrapper[4812]: I0131 04:33:00.347527 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="96aeb59a-f6e9-47a5-9b74-4c958754bd52" path="/var/lib/kubelet/pods/96aeb59a-f6e9-47a5-9b74-4c958754bd52/volumes" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.023761 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" event={"ID":"043ddf03-3d1c-42e9-b1d7-caab98f7564f","Type":"ContainerStarted","Data":"381b82c0c6ad4628310cbc0952205081f866dd05166cab78b1129795f8dfccf3"} Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.023996 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.026202 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" event={"ID":"faefba43-23cb-4bb3-8d24-244ebfbcec0f","Type":"ContainerStarted","Data":"8fb7fbfcd23f029a1d1fce017c97196877c61faf90fa4ee6ad9ebb4752c29f45"} Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.026456 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 
04:33:01.030805 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.032669 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.049935 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-route-controller-manager/route-controller-manager-8448bc74bf-t9dzt" podStartSLOduration=4.049910534 podStartE2EDuration="4.049910534s" podCreationTimestamp="2026-01-31 04:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:33:01.045614566 +0000 UTC m=+389.540636231" watchObservedRunningTime="2026-01-31 04:33:01.049910534 +0000 UTC m=+389.544932239" Jan 31 04:33:01 crc kubenswrapper[4812]: I0131 04:33:01.079357 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-controller-manager/controller-manager-5fd44d7946-jvw8t" podStartSLOduration=4.0793307070000004 podStartE2EDuration="4.079330707s" podCreationTimestamp="2026-01-31 04:32:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:33:01.070922785 +0000 UTC m=+389.565944450" watchObservedRunningTime="2026-01-31 04:33:01.079330707 +0000 UTC m=+389.574352402" Jan 31 04:33:10 crc kubenswrapper[4812]: I0131 04:33:10.802470 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-66df7c8f76-8mmpn" Jan 31 04:33:10 crc kubenswrapper[4812]: I0131 04:33:10.867457 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:33:12 crc 
kubenswrapper[4812]: I0131 04:33:12.048925 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.049818 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-sj99w" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="registry-server" containerID="cri-o://a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5" gracePeriod=30 Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.054837 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.055432 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-56x2d" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="registry-server" containerID="cri-o://517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673" gracePeriod=30 Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.073240 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.073443 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" containerID="cri-o://869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824" gracePeriod=30 Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.084300 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.084600 4812 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-marketplace-bfclc" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="registry-server" containerID="cri-o://6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c" gracePeriod=30 Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.095371 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.095583 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-487ln" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="registry-server" containerID="cri-o://ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e" gracePeriod=30 Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.103600 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rcfw5"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.104215 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.118452 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rcfw5"] Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.270928 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.271540 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72dx4\" (UniqueName: \"kubernetes.io/projected/55175f00-9682-4c72-a26a-3b050c99af46-kube-api-access-72dx4\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.271597 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.372780 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: 
\"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.372843 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-72dx4\" (UniqueName: \"kubernetes.io/projected/55175f00-9682-4c72-a26a-3b050c99af46-kube-api-access-72dx4\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.372914 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.374541 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-trusted-ca\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.379686 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/55175f00-9682-4c72-a26a-3b050c99af46-marketplace-operator-metrics\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.401828 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-72dx4\" (UniqueName: \"kubernetes.io/projected/55175f00-9682-4c72-a26a-3b050c99af46-kube-api-access-72dx4\") pod \"marketplace-operator-79b997595-rcfw5\" (UID: \"55175f00-9682-4c72-a26a-3b050c99af46\") " pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.488154 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.558242 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.681868 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities\") pod \"a16a82f9-4289-4749-bc62-df59dacefac1\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.681948 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7tx8z\" (UniqueName: \"kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z\") pod \"a16a82f9-4289-4749-bc62-df59dacefac1\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.681993 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content\") pod \"a16a82f9-4289-4749-bc62-df59dacefac1\" (UID: \"a16a82f9-4289-4749-bc62-df59dacefac1\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.683316 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities" (OuterVolumeSpecName: 
"utilities") pod "a16a82f9-4289-4749-bc62-df59dacefac1" (UID: "a16a82f9-4289-4749-bc62-df59dacefac1"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.694419 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z" (OuterVolumeSpecName: "kube-api-access-7tx8z") pod "a16a82f9-4289-4749-bc62-df59dacefac1" (UID: "a16a82f9-4289-4749-bc62-df59dacefac1"). InnerVolumeSpecName "kube-api-access-7tx8z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.772457 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "a16a82f9-4289-4749-bc62-df59dacefac1" (UID: "a16a82f9-4289-4749-bc62-df59dacefac1"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.783758 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.783780 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7tx8z\" (UniqueName: \"kubernetes.io/projected/a16a82f9-4289-4749-bc62-df59dacefac1-kube-api-access-7tx8z\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.783789 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/a16a82f9-4289-4749-bc62-df59dacefac1-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.801321 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.807447 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.823286 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.824896 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985577 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content\") pod \"d6e79cce-8b4e-491b-a976-a3649e3566cd\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985615 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-br6rt\" (UniqueName: \"kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt\") pod \"39f52f71-fcee-4193-95db-158c8fe2f71f\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985635 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qs4cj\" (UniqueName: \"kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj\") pod \"4dd847bf-95c7-48c4-9042-f078db7c8438\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985653 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities\") pod \"d6e79cce-8b4e-491b-a976-a3649e3566cd\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985718 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2t89x\" (UniqueName: \"kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x\") pod \"d9c1a0d3-b881-4382-89c4-905ad455a360\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985738 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content\") pod \"4dd847bf-95c7-48c4-9042-f078db7c8438\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985768 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities\") pod \"d9c1a0d3-b881-4382-89c4-905ad455a360\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985796 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content\") pod \"d9c1a0d3-b881-4382-89c4-905ad455a360\" (UID: \"d9c1a0d3-b881-4382-89c4-905ad455a360\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985817 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics\") pod \"39f52f71-fcee-4193-95db-158c8fe2f71f\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985841 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca\") pod \"39f52f71-fcee-4193-95db-158c8fe2f71f\" (UID: \"39f52f71-fcee-4193-95db-158c8fe2f71f\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985873 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w8v6l\" (UniqueName: \"kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l\") pod 
\"d6e79cce-8b4e-491b-a976-a3649e3566cd\" (UID: \"d6e79cce-8b4e-491b-a976-a3649e3566cd\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.985889 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities\") pod \"4dd847bf-95c7-48c4-9042-f078db7c8438\" (UID: \"4dd847bf-95c7-48c4-9042-f078db7c8438\") " Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.986628 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities" (OuterVolumeSpecName: "utilities") pod "4dd847bf-95c7-48c4-9042-f078db7c8438" (UID: "4dd847bf-95c7-48c4-9042-f078db7c8438"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.987218 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca" (OuterVolumeSpecName: "marketplace-trusted-ca") pod "39f52f71-fcee-4193-95db-158c8fe2f71f" (UID: "39f52f71-fcee-4193-95db-158c8fe2f71f"). InnerVolumeSpecName "marketplace-trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.987652 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities" (OuterVolumeSpecName: "utilities") pod "d9c1a0d3-b881-4382-89c4-905ad455a360" (UID: "d9c1a0d3-b881-4382-89c4-905ad455a360"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.988541 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x" (OuterVolumeSpecName: "kube-api-access-2t89x") pod "d9c1a0d3-b881-4382-89c4-905ad455a360" (UID: "d9c1a0d3-b881-4382-89c4-905ad455a360"). InnerVolumeSpecName "kube-api-access-2t89x". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.989277 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics" (OuterVolumeSpecName: "marketplace-operator-metrics") pod "39f52f71-fcee-4193-95db-158c8fe2f71f" (UID: "39f52f71-fcee-4193-95db-158c8fe2f71f"). InnerVolumeSpecName "marketplace-operator-metrics". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.989409 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l" (OuterVolumeSpecName: "kube-api-access-w8v6l") pod "d6e79cce-8b4e-491b-a976-a3649e3566cd" (UID: "d6e79cce-8b4e-491b-a976-a3649e3566cd"). InnerVolumeSpecName "kube-api-access-w8v6l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.989410 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt" (OuterVolumeSpecName: "kube-api-access-br6rt") pod "39f52f71-fcee-4193-95db-158c8fe2f71f" (UID: "39f52f71-fcee-4193-95db-158c8fe2f71f"). InnerVolumeSpecName "kube-api-access-br6rt". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.989761 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj" (OuterVolumeSpecName: "kube-api-access-qs4cj") pod "4dd847bf-95c7-48c4-9042-f078db7c8438" (UID: "4dd847bf-95c7-48c4-9042-f078db7c8438"). InnerVolumeSpecName "kube-api-access-qs4cj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:12 crc kubenswrapper[4812]: I0131 04:33:12.994344 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities" (OuterVolumeSpecName: "utilities") pod "d6e79cce-8b4e-491b-a976-a3649e3566cd" (UID: "d6e79cce-8b4e-491b-a976-a3649e3566cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.007064 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d6e79cce-8b4e-491b-a976-a3649e3566cd" (UID: "d6e79cce-8b4e-491b-a976-a3649e3566cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.030351 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d9c1a0d3-b881-4382-89c4-905ad455a360" (UID: "d9c1a0d3-b881-4382-89c4-905ad455a360"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.071798 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-rcfw5"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087359 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2t89x\" (UniqueName: \"kubernetes.io/projected/d9c1a0d3-b881-4382-89c4-905ad455a360-kube-api-access-2t89x\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087386 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087397 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d9c1a0d3-b881-4382-89c4-905ad455a360-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087408 4812 reconciler_common.go:293] "Volume detached for volume \"marketplace-operator-metrics\" (UniqueName: \"kubernetes.io/secret/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-operator-metrics\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087419 4812 reconciler_common.go:293] "Volume detached for volume \"marketplace-trusted-ca\" (UniqueName: \"kubernetes.io/configmap/39f52f71-fcee-4193-95db-158c8fe2f71f-marketplace-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087430 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w8v6l\" (UniqueName: \"kubernetes.io/projected/d6e79cce-8b4e-491b-a976-a3649e3566cd-kube-api-access-w8v6l\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087437 4812 
reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087445 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087455 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-br6rt\" (UniqueName: \"kubernetes.io/projected/39f52f71-fcee-4193-95db-158c8fe2f71f-kube-api-access-br6rt\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087463 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qs4cj\" (UniqueName: \"kubernetes.io/projected/4dd847bf-95c7-48c4-9042-f078db7c8438-kube-api-access-qs4cj\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.087471 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d6e79cce-8b4e-491b-a976-a3649e3566cd-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.119788 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4dd847bf-95c7-48c4-9042-f078db7c8438" (UID: "4dd847bf-95c7-48c4-9042-f078db7c8438"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.120071 4812 generic.go:334] "Generic (PLEG): container finished" podID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerID="6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c" exitCode=0 Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.120135 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerDied","Data":"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.120155 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-bfclc" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.120174 4812 scope.go:117] "RemoveContainer" containerID="6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.120162 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-bfclc" event={"ID":"d6e79cce-8b4e-491b-a976-a3649e3566cd","Type":"ContainerDied","Data":"1b4f331baad4814be56b4c73a7c5324ad59c5d53c894ac487a2263fedc640413"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.121362 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" event={"ID":"55175f00-9682-4c72-a26a-3b050c99af46","Type":"ContainerStarted","Data":"e3cc18ba487078bcf70c96fb09f8d2afb34e8639154d0f3e10460efb9d6387fb"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.123712 4812 generic.go:334] "Generic (PLEG): container finished" podID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerID="ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e" exitCode=0 Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.123772 4812 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerDied","Data":"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.123792 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-487ln" event={"ID":"4dd847bf-95c7-48c4-9042-f078db7c8438","Type":"ContainerDied","Data":"313226391aedff4e14cbad291285afd18f3e89973bf3b025bab1fbfd16584a3c"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.123810 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-487ln" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.125476 4812 generic.go:334] "Generic (PLEG): container finished" podID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerID="a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5" exitCode=0 Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.125521 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerDied","Data":"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.125540 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-sj99w" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.125541 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-sj99w" event={"ID":"d9c1a0d3-b881-4382-89c4-905ad455a360","Type":"ContainerDied","Data":"9a1f35ffa250e51d2c1f4966b18ada5533fef5d912eb4d60dde54af38de4ada4"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.127888 4812 generic.go:334] "Generic (PLEG): container finished" podID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerID="869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824" exitCode=0 Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.127928 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" event={"ID":"39f52f71-fcee-4193-95db-158c8fe2f71f","Type":"ContainerDied","Data":"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.127953 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" event={"ID":"39f52f71-fcee-4193-95db-158c8fe2f71f","Type":"ContainerDied","Data":"ee792830d50fec32b4523dd47ae2a02e20fd4c62fcd3100a8dab68384745faea"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.128031 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/marketplace-operator-79b997595-wh59s" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.130690 4812 generic.go:334] "Generic (PLEG): container finished" podID="a16a82f9-4289-4749-bc62-df59dacefac1" containerID="517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673" exitCode=0 Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.130732 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-56x2d" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.130730 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerDied","Data":"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.130945 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-56x2d" event={"ID":"a16a82f9-4289-4749-bc62-df59dacefac1","Type":"ContainerDied","Data":"3c2f9ecd49074dbef0afbbf95d2fa12287e098f61c07e2b5a3bec7ae05b19908"} Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.136176 4812 scope.go:117] "RemoveContainer" containerID="3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.171619 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.175258 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-sj99w"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.180619 4812 scope.go:117] "RemoveContainer" containerID="8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.188875 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4dd847bf-95c7-48c4-9042-f078db7c8438-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.190350 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.193081 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openshift-marketplace/redhat-marketplace-bfclc"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.247781 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.251802 4812 scope.go:117] "RemoveContainer" containerID="6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.255263 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c\": container with ID starting with 6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c not found: ID does not exist" containerID="6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.255312 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c"} err="failed to get container status \"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c\": rpc error: code = NotFound desc = could not find container \"6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c\": container with ID starting with 6d23e593f2c32616e2a67489ddfbbabb55f13e144af0499dfc08b8d77b83a58c not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.255343 4812 scope.go:117] "RemoveContainer" containerID="3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.256533 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa\": container with ID starting with 3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa 
not found: ID does not exist" containerID="3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.256685 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa"} err="failed to get container status \"3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa\": rpc error: code = NotFound desc = could not find container \"3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa\": container with ID starting with 3fe6753d1f29bb145f5fc69b9c6197677f32cd31331022cc48553cab2d48aeaa not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.256796 4812 scope.go:117] "RemoveContainer" containerID="8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.256999 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-487ln"] Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.257551 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62\": container with ID starting with 8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62 not found: ID does not exist" containerID="8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.257669 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62"} err="failed to get container status \"8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62\": rpc error: code = NotFound desc = could not find container \"8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62\": container with ID 
starting with 8f2428e073b5d38eb4f28ed9f5a70914e89ba89b01213362a259c1dc74fb2a62 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.257761 4812 scope.go:117] "RemoveContainer" containerID="ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.259481 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.266658 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-56x2d"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.269470 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.272104 4812 scope.go:117] "RemoveContainer" containerID="e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.272191 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/marketplace-operator-79b997595-wh59s"] Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.288899 4812 scope.go:117] "RemoveContainer" containerID="edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.307351 4812 scope.go:117] "RemoveContainer" containerID="ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.307786 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e\": container with ID starting with ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e not found: ID does not exist" 
containerID="ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.307816 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e"} err="failed to get container status \"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e\": rpc error: code = NotFound desc = could not find container \"ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e\": container with ID starting with ae05c66f69e1e02517d82b3bed9a6b2e5058b8a1be0a535136cc8ae56c543b5e not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.307840 4812 scope.go:117] "RemoveContainer" containerID="e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.308537 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da\": container with ID starting with e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da not found: ID does not exist" containerID="e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.308852 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da"} err="failed to get container status \"e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da\": rpc error: code = NotFound desc = could not find container \"e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da\": container with ID starting with e7d93a9f80402af657c1e40ed64e499a4b5aa0a87a64b2223fbdfb8a3781f4da not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.308899 4812 scope.go:117] 
"RemoveContainer" containerID="edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.309384 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755\": container with ID starting with edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755 not found: ID does not exist" containerID="edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.309432 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755"} err="failed to get container status \"edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755\": rpc error: code = NotFound desc = could not find container \"edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755\": container with ID starting with edf6583d5014fdfd760e20e587e008bce2aab11def74b4b8eb883ba64eb07755 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.309461 4812 scope.go:117] "RemoveContainer" containerID="a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.321932 4812 scope.go:117] "RemoveContainer" containerID="78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.339118 4812 scope.go:117] "RemoveContainer" containerID="79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.352604 4812 scope.go:117] "RemoveContainer" containerID="a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.354979 4812 log.go:32] "ContainerStatus from runtime service failed" 
err="rpc error: code = NotFound desc = could not find container \"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5\": container with ID starting with a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5 not found: ID does not exist" containerID="a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.355038 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5"} err="failed to get container status \"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5\": rpc error: code = NotFound desc = could not find container \"a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5\": container with ID starting with a32443c4023bc5aadbf1d3592f05699c789fcaa350f3afb2039265991df79cf5 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.355064 4812 scope.go:117] "RemoveContainer" containerID="78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.355689 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118\": container with ID starting with 78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118 not found: ID does not exist" containerID="78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.355720 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118"} err="failed to get container status \"78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118\": rpc error: code = NotFound desc = could not find container 
\"78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118\": container with ID starting with 78cad420399379721fd356f80031514a12cca27a708659c771efdb057cda0118 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.355745 4812 scope.go:117] "RemoveContainer" containerID="79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.356041 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062\": container with ID starting with 79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062 not found: ID does not exist" containerID="79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.356090 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062"} err="failed to get container status \"79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062\": rpc error: code = NotFound desc = could not find container \"79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062\": container with ID starting with 79ecf45b689db92850e4c345545b4a366a69707062d2c5e0b71a4549a7e0b062 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.356120 4812 scope.go:117] "RemoveContainer" containerID="869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.370570 4812 scope.go:117] "RemoveContainer" containerID="869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.370914 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824\": container with ID starting with 869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824 not found: ID does not exist" containerID="869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.370955 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824"} err="failed to get container status \"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824\": rpc error: code = NotFound desc = could not find container \"869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824\": container with ID starting with 869cf44ac858dfd277abe4090e22fa3383fb1327d26b6eb45dd3bc8327b8a824 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.370976 4812 scope.go:117] "RemoveContainer" containerID="517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.384516 4812 scope.go:117] "RemoveContainer" containerID="8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.404537 4812 scope.go:117] "RemoveContainer" containerID="9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.416420 4812 scope.go:117] "RemoveContainer" containerID="517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.416725 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673\": container with ID starting with 517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673 not found: ID does not exist" 
containerID="517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.416758 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673"} err="failed to get container status \"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673\": rpc error: code = NotFound desc = could not find container \"517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673\": container with ID starting with 517ff3d712e068a38b31d8830cbc15a89976cdd4436908483be8645f25758673 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.416782 4812 scope.go:117] "RemoveContainer" containerID="8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.417149 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811\": container with ID starting with 8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811 not found: ID does not exist" containerID="8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.417196 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811"} err="failed to get container status \"8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811\": rpc error: code = NotFound desc = could not find container \"8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811\": container with ID starting with 8d5bfdd7f53147778050d09f13e9c6cea7f351340ef230100e74eb4722efc811 not found: ID does not exist" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.417227 4812 scope.go:117] 
"RemoveContainer" containerID="9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b" Jan 31 04:33:13 crc kubenswrapper[4812]: E0131 04:33:13.417470 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b\": container with ID starting with 9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b not found: ID does not exist" containerID="9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b" Jan 31 04:33:13 crc kubenswrapper[4812]: I0131 04:33:13.417490 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b"} err="failed to get container status \"9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b\": rpc error: code = NotFound desc = could not find container \"9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b\": container with ID starting with 9ab62272c592d818d9f8ba6a6caa9ca310fc06f211b6052eefb025208c2c738b not found: ID does not exist" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.142300 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" event={"ID":"55175f00-9682-4c72-a26a-3b050c99af46","Type":"ContainerStarted","Data":"03c21e7a49fe365ea7290489af78b367d8a2602c2cc44dbad66e1e6bf1d9706e"} Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.143868 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.150678 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.169477 4812 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="openshift-marketplace/marketplace-operator-79b997595-rcfw5" podStartSLOduration=2.169460396 podStartE2EDuration="2.169460396s" podCreationTimestamp="2026-01-31 04:33:12 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:33:14.169171768 +0000 UTC m=+402.664193493" watchObservedRunningTime="2026-01-31 04:33:14.169460396 +0000 UTC m=+402.664482071" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270014 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-km6lg"] Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270290 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270307 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270322 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270331 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270347 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270356 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270401 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270412 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270422 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270430 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270442 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270451 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270464 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270473 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270485 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270493 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270508 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270515 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="extract-utilities" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270526 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270534 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270547 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270555 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270566 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270574 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: E0131 04:33:14.270586 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270594 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="extract-content" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270717 4812 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270733 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270748 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" containerName="marketplace-operator" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270759 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.270771 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" containerName="registry-server" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.271757 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.276933 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-marketplace-dockercfg-x2ctb" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.286695 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-km6lg"] Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.338643 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.338715 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.345183 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="39f52f71-fcee-4193-95db-158c8fe2f71f" path="/var/lib/kubelet/pods/39f52f71-fcee-4193-95db-158c8fe2f71f/volumes" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.345622 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4dd847bf-95c7-48c4-9042-f078db7c8438" path="/var/lib/kubelet/pods/4dd847bf-95c7-48c4-9042-f078db7c8438/volumes" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.346179 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a16a82f9-4289-4749-bc62-df59dacefac1" path="/var/lib/kubelet/pods/a16a82f9-4289-4749-bc62-df59dacefac1/volumes" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 
04:33:14.347151 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d6e79cce-8b4e-491b-a976-a3649e3566cd" path="/var/lib/kubelet/pods/d6e79cce-8b4e-491b-a976-a3649e3566cd/volumes" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.347673 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9c1a0d3-b881-4382-89c4-905ad455a360" path="/var/lib/kubelet/pods/d9c1a0d3-b881-4382-89c4-905ad455a360/volumes" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.404384 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-utilities\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.404459 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xpbjn\" (UniqueName: \"kubernetes.io/projected/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-kube-api-access-xpbjn\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.404503 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-catalog-content\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.473433 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-r58bf"] Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.481178 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" 
pods=["openshift-marketplace/certified-operators-r58bf"] Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.481385 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.486177 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"certified-operators-dockercfg-4rs5g" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.507420 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-utilities\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.507710 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xpbjn\" (UniqueName: \"kubernetes.io/projected/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-kube-api-access-xpbjn\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.507916 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-catalog-content\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.508489 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-utilities\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " 
pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.508729 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-catalog-content\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.529323 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xpbjn\" (UniqueName: \"kubernetes.io/projected/663ce6b5-46cb-45ca-9e5c-9ef13d78189f-kube-api-access-xpbjn\") pod \"redhat-marketplace-km6lg\" (UID: \"663ce6b5-46cb-45ca-9e5c-9ef13d78189f\") " pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.590434 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.609514 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9tmf\" (UniqueName: \"kubernetes.io/projected/1e130501-5caf-49a3-bd51-61ecde347414-kube-api-access-w9tmf\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.609598 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-catalog-content\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.609702 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-utilities\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.710601 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-utilities\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.711283 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-utilities\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.711430 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9tmf\" (UniqueName: \"kubernetes.io/projected/1e130501-5caf-49a3-bd51-61ecde347414-kube-api-access-w9tmf\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.711569 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-catalog-content\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.712079 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/1e130501-5caf-49a3-bd51-61ecde347414-catalog-content\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.730217 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9tmf\" (UniqueName: \"kubernetes.io/projected/1e130501-5caf-49a3-bd51-61ecde347414-kube-api-access-w9tmf\") pod \"certified-operators-r58bf\" (UID: \"1e130501-5caf-49a3-bd51-61ecde347414\") " pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:14 crc kubenswrapper[4812]: I0131 04:33:14.812797 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:15 crc kubenswrapper[4812]: I0131 04:33:15.001065 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-km6lg"] Jan 31 04:33:15 crc kubenswrapper[4812]: W0131 04:33:15.012535 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663ce6b5_46cb_45ca_9e5c_9ef13d78189f.slice/crio-4d81a4aba42d278e7f43d98f2ee9b68c68af3d5a0ebc465c8b5add7cf8aa7a5c WatchSource:0}: Error finding container 4d81a4aba42d278e7f43d98f2ee9b68c68af3d5a0ebc465c8b5add7cf8aa7a5c: Status 404 returned error can't find the container with id 4d81a4aba42d278e7f43d98f2ee9b68c68af3d5a0ebc465c8b5add7cf8aa7a5c Jan 31 04:33:15 crc kubenswrapper[4812]: I0131 04:33:15.155008 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-km6lg" event={"ID":"663ce6b5-46cb-45ca-9e5c-9ef13d78189f","Type":"ContainerStarted","Data":"4d81a4aba42d278e7f43d98f2ee9b68c68af3d5a0ebc465c8b5add7cf8aa7a5c"} Jan 31 04:33:15 crc kubenswrapper[4812]: I0131 04:33:15.262143 4812 kubelet.go:2428] "SyncLoop 
UPDATE" source="api" pods=["openshift-marketplace/certified-operators-r58bf"] Jan 31 04:33:15 crc kubenswrapper[4812]: W0131 04:33:15.330605 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e130501_5caf_49a3_bd51_61ecde347414.slice/crio-443bbb2e33610a04b2082cb481a2248bcaab1a7a558c72ed8aa66c5c5afc2941 WatchSource:0}: Error finding container 443bbb2e33610a04b2082cb481a2248bcaab1a7a558c72ed8aa66c5c5afc2941: Status 404 returned error can't find the container with id 443bbb2e33610a04b2082cb481a2248bcaab1a7a558c72ed8aa66c5c5afc2941 Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.164223 4812 generic.go:334] "Generic (PLEG): container finished" podID="663ce6b5-46cb-45ca-9e5c-9ef13d78189f" containerID="3df836f3e847356a03e3002f229dc038c144f6066ef0dbf477d2e590e70351e7" exitCode=0 Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.164330 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-km6lg" event={"ID":"663ce6b5-46cb-45ca-9e5c-9ef13d78189f","Type":"ContainerDied","Data":"3df836f3e847356a03e3002f229dc038c144f6066ef0dbf477d2e590e70351e7"} Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.168875 4812 generic.go:334] "Generic (PLEG): container finished" podID="1e130501-5caf-49a3-bd51-61ecde347414" containerID="e60a00b36a479fb3bae3b34e7cc90ff9e50b717103280d08a0f7eab98207efa9" exitCode=0 Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.168964 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r58bf" event={"ID":"1e130501-5caf-49a3-bd51-61ecde347414","Type":"ContainerDied","Data":"e60a00b36a479fb3bae3b34e7cc90ff9e50b717103280d08a0f7eab98207efa9"} Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.169013 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r58bf" 
event={"ID":"1e130501-5caf-49a3-bd51-61ecde347414","Type":"ContainerStarted","Data":"443bbb2e33610a04b2082cb481a2248bcaab1a7a558c72ed8aa66c5c5afc2941"} Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.667217 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-jxdsd"] Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.669474 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.673556 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"community-operators-dockercfg-dmngl" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.681484 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxdsd"] Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.840564 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-catalog-content\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.841051 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q45mr\" (UniqueName: \"kubernetes.io/projected/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-kube-api-access-q45mr\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.841145 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-utilities\") pod 
\"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.872706 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-9l62r"] Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.874841 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.878689 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"redhat-operators-dockercfg-ct8rh" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.879060 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l62r"] Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.942710 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-catalog-content\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.942975 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-q45mr\" (UniqueName: \"kubernetes.io/projected/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-kube-api-access-q45mr\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.943021 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-utilities\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " 
pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.945945 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-catalog-content\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.961018 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-utilities\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.971045 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-q45mr\" (UniqueName: \"kubernetes.io/projected/567d2603-f5cd-4e06-a78d-d0ad581f7d3f-kube-api-access-q45mr\") pod \"community-operators-jxdsd\" (UID: \"567d2603-f5cd-4e06-a78d-d0ad581f7d3f\") " pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:16 crc kubenswrapper[4812]: I0131 04:33:16.985916 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.044418 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnc28\" (UniqueName: \"kubernetes.io/projected/15084025-2b02-454c-9b65-e2e943d80e39-kube-api-access-qnc28\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.044468 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-utilities\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.044497 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-catalog-content\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.145239 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qnc28\" (UniqueName: \"kubernetes.io/projected/15084025-2b02-454c-9b65-e2e943d80e39-kube-api-access-qnc28\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.145652 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-utilities\") pod \"redhat-operators-9l62r\" (UID: 
\"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.145690 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-catalog-content\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.146747 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-utilities\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.148419 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/15084025-2b02-454c-9b65-e2e943d80e39-catalog-content\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.164632 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnc28\" (UniqueName: \"kubernetes.io/projected/15084025-2b02-454c-9b65-e2e943d80e39-kube-api-access-qnc28\") pod \"redhat-operators-9l62r\" (UID: \"15084025-2b02-454c-9b65-e2e943d80e39\") " pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.177527 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-km6lg" event={"ID":"663ce6b5-46cb-45ca-9e5c-9ef13d78189f","Type":"ContainerStarted","Data":"2ef22e6f0a415e5d05c5b1f21c43fb7caafb294419c6363e3c748011221e6d62"} Jan 31 04:33:17 crc kubenswrapper[4812]: 
I0131 04:33:17.180106 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r58bf" event={"ID":"1e130501-5caf-49a3-bd51-61ecde347414","Type":"ContainerStarted","Data":"b31d16b333af69ab244bd57ca14316f14e15d3f5318c335c32d93c77db413c69"} Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.195787 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:17 crc kubenswrapper[4812]: E0131 04:33:17.382443 4812 cadvisor_stats_provider.go:516] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663ce6b5_46cb_45ca_9e5c_9ef13d78189f.slice/crio-2ef22e6f0a415e5d05c5b1f21c43fb7caafb294419c6363e3c748011221e6d62.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod663ce6b5_46cb_45ca_9e5c_9ef13d78189f.slice/crio-conmon-2ef22e6f0a415e5d05c5b1f21c43fb7caafb294419c6363e3c748011221e6d62.scope\": RecentStats: unable to find data in memory cache]" Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.428796 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-jxdsd"] Jan 31 04:33:17 crc kubenswrapper[4812]: I0131 04:33:17.676812 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-9l62r"] Jan 31 04:33:17 crc kubenswrapper[4812]: W0131 04:33:17.681790 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15084025_2b02_454c_9b65_e2e943d80e39.slice/crio-85ed9828eb513136b2512406dacc4d4dad2b3692cd3b8c847e9c60ee82e04c0d WatchSource:0}: Error finding container 85ed9828eb513136b2512406dacc4d4dad2b3692cd3b8c847e9c60ee82e04c0d: Status 404 returned error can't find the container with id 
85ed9828eb513136b2512406dacc4d4dad2b3692cd3b8c847e9c60ee82e04c0d Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.195180 4812 generic.go:334] "Generic (PLEG): container finished" podID="663ce6b5-46cb-45ca-9e5c-9ef13d78189f" containerID="2ef22e6f0a415e5d05c5b1f21c43fb7caafb294419c6363e3c748011221e6d62" exitCode=0 Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.195284 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-km6lg" event={"ID":"663ce6b5-46cb-45ca-9e5c-9ef13d78189f","Type":"ContainerDied","Data":"2ef22e6f0a415e5d05c5b1f21c43fb7caafb294419c6363e3c748011221e6d62"} Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.203410 4812 generic.go:334] "Generic (PLEG): container finished" podID="1e130501-5caf-49a3-bd51-61ecde347414" containerID="b31d16b333af69ab244bd57ca14316f14e15d3f5318c335c32d93c77db413c69" exitCode=0 Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.203470 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r58bf" event={"ID":"1e130501-5caf-49a3-bd51-61ecde347414","Type":"ContainerDied","Data":"b31d16b333af69ab244bd57ca14316f14e15d3f5318c335c32d93c77db413c69"} Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.208133 4812 generic.go:334] "Generic (PLEG): container finished" podID="15084025-2b02-454c-9b65-e2e943d80e39" containerID="6cd70202056e880968262880a71cfa300eb5f40a01d859dede237e079a85fe22" exitCode=0 Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.208194 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l62r" event={"ID":"15084025-2b02-454c-9b65-e2e943d80e39","Type":"ContainerDied","Data":"6cd70202056e880968262880a71cfa300eb5f40a01d859dede237e079a85fe22"} Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.208217 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l62r" 
event={"ID":"15084025-2b02-454c-9b65-e2e943d80e39","Type":"ContainerStarted","Data":"85ed9828eb513136b2512406dacc4d4dad2b3692cd3b8c847e9c60ee82e04c0d"} Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.212334 4812 generic.go:334] "Generic (PLEG): container finished" podID="567d2603-f5cd-4e06-a78d-d0ad581f7d3f" containerID="4358ba3021eb712a4cc9466d82f63401d26afd4406da7ebaf7474716789c6519" exitCode=0 Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.212399 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxdsd" event={"ID":"567d2603-f5cd-4e06-a78d-d0ad581f7d3f","Type":"ContainerDied","Data":"4358ba3021eb712a4cc9466d82f63401d26afd4406da7ebaf7474716789c6519"} Jan 31 04:33:18 crc kubenswrapper[4812]: I0131 04:33:18.212423 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxdsd" event={"ID":"567d2603-f5cd-4e06-a78d-d0ad581f7d3f","Type":"ContainerStarted","Data":"b34e2cc8a00ff2991ec534a44d3ad5853cf89d4e7bedd3ea871c76f571d67438"} Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.224202 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l62r" event={"ID":"15084025-2b02-454c-9b65-e2e943d80e39","Type":"ContainerStarted","Data":"e94e7d0b673b4ca12cc02fabeb9b7684e2b3f2476d8e96131794bfb5c44dcae9"} Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.227105 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxdsd" event={"ID":"567d2603-f5cd-4e06-a78d-d0ad581f7d3f","Type":"ContainerStarted","Data":"ee151c5aa1aed2f5b2de1b62c1c4ac8d7409edc0dbf81ed59c47f20623b0404a"} Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.229107 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-km6lg" 
event={"ID":"663ce6b5-46cb-45ca-9e5c-9ef13d78189f","Type":"ContainerStarted","Data":"c3218ba86a2f6f68e82bab66eabb1b1c1340ed18ec42cee3bbe2bb59f9b315ee"} Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.231250 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-r58bf" event={"ID":"1e130501-5caf-49a3-bd51-61ecde347414","Type":"ContainerStarted","Data":"2cd94c6a3507536690a996975c596c3b10613863a25e3d1f652563d5a2ec3bc2"} Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.275645 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-r58bf" podStartSLOduration=2.768316666 podStartE2EDuration="5.275630092s" podCreationTimestamp="2026-01-31 04:33:14 +0000 UTC" firstStartedPulling="2026-01-31 04:33:16.170473411 +0000 UTC m=+404.665495076" lastFinishedPulling="2026-01-31 04:33:18.677786827 +0000 UTC m=+407.172808502" observedRunningTime="2026-01-31 04:33:19.269987356 +0000 UTC m=+407.765009021" watchObservedRunningTime="2026-01-31 04:33:19.275630092 +0000 UTC m=+407.770651757" Jan 31 04:33:19 crc kubenswrapper[4812]: I0131 04:33:19.289680 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-km6lg" podStartSLOduration=2.817078941 podStartE2EDuration="5.289665769s" podCreationTimestamp="2026-01-31 04:33:14 +0000 UTC" firstStartedPulling="2026-01-31 04:33:16.166215244 +0000 UTC m=+404.661236929" lastFinishedPulling="2026-01-31 04:33:18.638802092 +0000 UTC m=+407.133823757" observedRunningTime="2026-01-31 04:33:19.286741828 +0000 UTC m=+407.781763493" watchObservedRunningTime="2026-01-31 04:33:19.289665769 +0000 UTC m=+407.784687434" Jan 31 04:33:20 crc kubenswrapper[4812]: I0131 04:33:20.239633 4812 generic.go:334] "Generic (PLEG): container finished" podID="15084025-2b02-454c-9b65-e2e943d80e39" containerID="e94e7d0b673b4ca12cc02fabeb9b7684e2b3f2476d8e96131794bfb5c44dcae9" exitCode=0 Jan 31 
04:33:20 crc kubenswrapper[4812]: I0131 04:33:20.239715 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l62r" event={"ID":"15084025-2b02-454c-9b65-e2e943d80e39","Type":"ContainerDied","Data":"e94e7d0b673b4ca12cc02fabeb9b7684e2b3f2476d8e96131794bfb5c44dcae9"} Jan 31 04:33:20 crc kubenswrapper[4812]: I0131 04:33:20.241511 4812 generic.go:334] "Generic (PLEG): container finished" podID="567d2603-f5cd-4e06-a78d-d0ad581f7d3f" containerID="ee151c5aa1aed2f5b2de1b62c1c4ac8d7409edc0dbf81ed59c47f20623b0404a" exitCode=0 Jan 31 04:33:20 crc kubenswrapper[4812]: I0131 04:33:20.241544 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxdsd" event={"ID":"567d2603-f5cd-4e06-a78d-d0ad581f7d3f","Type":"ContainerDied","Data":"ee151c5aa1aed2f5b2de1b62c1c4ac8d7409edc0dbf81ed59c47f20623b0404a"} Jan 31 04:33:21 crc kubenswrapper[4812]: I0131 04:33:21.252875 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-9l62r" event={"ID":"15084025-2b02-454c-9b65-e2e943d80e39","Type":"ContainerStarted","Data":"ac62c779c4585aa1a520be201c251637a7200bf989e23eceb552d6c958011267"} Jan 31 04:33:21 crc kubenswrapper[4812]: I0131 04:33:21.257423 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-jxdsd" event={"ID":"567d2603-f5cd-4e06-a78d-d0ad581f7d3f","Type":"ContainerStarted","Data":"caa8a2f16ab2678ea0541b5c7071246b7ab89c97196a91a3047aba51df3ba6a3"} Jan 31 04:33:21 crc kubenswrapper[4812]: I0131 04:33:21.279429 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-9l62r" podStartSLOduration=2.742951714 podStartE2EDuration="5.279406283s" podCreationTimestamp="2026-01-31 04:33:16 +0000 UTC" firstStartedPulling="2026-01-31 04:33:18.209369341 +0000 UTC m=+406.704391006" lastFinishedPulling="2026-01-31 04:33:20.74582388 +0000 UTC m=+409.240845575" 
observedRunningTime="2026-01-31 04:33:21.274762335 +0000 UTC m=+409.769784000" watchObservedRunningTime="2026-01-31 04:33:21.279406283 +0000 UTC m=+409.774427988" Jan 31 04:33:21 crc kubenswrapper[4812]: I0131 04:33:21.306082 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-jxdsd" podStartSLOduration=2.860317249 podStartE2EDuration="5.306061478s" podCreationTimestamp="2026-01-31 04:33:16 +0000 UTC" firstStartedPulling="2026-01-31 04:33:18.214978755 +0000 UTC m=+406.710000460" lastFinishedPulling="2026-01-31 04:33:20.660723004 +0000 UTC m=+409.155744689" observedRunningTime="2026-01-31 04:33:21.305015999 +0000 UTC m=+409.800037754" watchObservedRunningTime="2026-01-31 04:33:21.306061478 +0000 UTC m=+409.801083143" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.591526 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.591798 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.674427 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.814177 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.814251 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:24 crc kubenswrapper[4812]: I0131 04:33:24.883410 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:25 crc kubenswrapper[4812]: I0131 04:33:25.326886 4812 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-km6lg" Jan 31 04:33:25 crc kubenswrapper[4812]: I0131 04:33:25.376392 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-r58bf" Jan 31 04:33:26 crc kubenswrapper[4812]: I0131 04:33:26.985973 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:26 crc kubenswrapper[4812]: I0131 04:33:26.986244 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:27 crc kubenswrapper[4812]: I0131 04:33:27.055802 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:27 crc kubenswrapper[4812]: I0131 04:33:27.196885 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:27 crc kubenswrapper[4812]: I0131 04:33:27.197092 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:27 crc kubenswrapper[4812]: I0131 04:33:27.358095 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-jxdsd" Jan 31 04:33:28 crc kubenswrapper[4812]: I0131 04:33:28.258375 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-9l62r" podUID="15084025-2b02-454c-9b65-e2e943d80e39" containerName="registry-server" probeResult="failure" output=< Jan 31 04:33:28 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:33:28 crc kubenswrapper[4812]: > Jan 31 04:33:35 crc kubenswrapper[4812]: I0131 04:33:35.962271 4812 kuberuntime_container.go:808] "Killing container with a grace 
period" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" podUID="a2bf1dee-41d8-4797-ae33-0e659438727b" containerName="registry" containerID="cri-o://c0a4d23b092e2b0f54b4c81e0604d5afd4d1dc7de77db4f65d5f50ca3345499e" gracePeriod=30 Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.343727 4812 generic.go:334] "Generic (PLEG): container finished" podID="a2bf1dee-41d8-4797-ae33-0e659438727b" containerID="c0a4d23b092e2b0f54b4c81e0604d5afd4d1dc7de77db4f65d5f50ca3345499e" exitCode=0 Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.348999 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" event={"ID":"a2bf1dee-41d8-4797-ae33-0e659438727b","Type":"ContainerDied","Data":"c0a4d23b092e2b0f54b4c81e0604d5afd4d1dc7de77db4f65d5f50ca3345499e"} Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.488831 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638295 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638394 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638435 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4g9lm\" (UniqueName: 
\"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638495 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638568 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.638955 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"registry-storage\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.639060 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.639145 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca\") pod \"a2bf1dee-41d8-4797-ae33-0e659438727b\" (UID: \"a2bf1dee-41d8-4797-ae33-0e659438727b\") " Jan 
31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.641108 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.641543 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.648724 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm" (OuterVolumeSpecName: "kube-api-access-4g9lm") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "kube-api-access-4g9lm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.649130 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.649262 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.661064 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.668404 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8" (OuterVolumeSpecName: "registry-storage") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "pvc-657094db-63f1-4ba8-9a24-edca0e80b7a8". PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.672317 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "a2bf1dee-41d8-4797-ae33-0e659438727b" (UID: "a2bf1dee-41d8-4797-ae33-0e659438727b"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741451 4812 reconciler_common.go:293] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-tls\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741509 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4g9lm\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-kube-api-access-4g9lm\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741531 4812 reconciler_common.go:293] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-registry-certificates\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741552 4812 reconciler_common.go:293] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/a2bf1dee-41d8-4797-ae33-0e659438727b-installation-pull-secrets\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741569 4812 reconciler_common.go:293] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a2bf1dee-41d8-4797-ae33-0e659438727b-bound-sa-token\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741586 4812 reconciler_common.go:293] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/a2bf1dee-41d8-4797-ae33-0e659438727b-trusted-ca\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:36 crc kubenswrapper[4812]: I0131 04:33:36.741602 4812 reconciler_common.go:293] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/a2bf1dee-41d8-4797-ae33-0e659438727b-ca-trust-extracted\") on node \"crc\" DevicePath \"\"" Jan 31 04:33:37 crc 
kubenswrapper[4812]: I0131 04:33:37.266141 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.329444 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-9l62r" Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.368221 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.368269 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-697d97f7c8-5c747" event={"ID":"a2bf1dee-41d8-4797-ae33-0e659438727b","Type":"ContainerDied","Data":"af78a33f86b096dffb569eb2bd6d458c4004524ec53d2803636625378c8cf385"} Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.368307 4812 scope.go:117] "RemoveContainer" containerID="c0a4d23b092e2b0f54b4c81e0604d5afd4d1dc7de77db4f65d5f50ca3345499e" Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.427423 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:33:37 crc kubenswrapper[4812]: I0131 04:33:37.430765 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-697d97f7c8-5c747"] Jan 31 04:33:38 crc kubenswrapper[4812]: I0131 04:33:38.346637 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a2bf1dee-41d8-4797-ae33-0e659438727b" path="/var/lib/kubelet/pods/a2bf1dee-41d8-4797-ae33-0e659438727b/volumes" Jan 31 04:33:44 crc kubenswrapper[4812]: I0131 04:33:44.338428 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: 
connect: connection refused" start-of-body= Jan 31 04:33:44 crc kubenswrapper[4812]: I0131 04:33:44.338808 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:33:44 crc kubenswrapper[4812]: I0131 04:33:44.352459 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:33:44 crc kubenswrapper[4812]: I0131 04:33:44.353343 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:33:44 crc kubenswrapper[4812]: I0131 04:33:44.353449 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521" gracePeriod=600 Jan 31 04:33:45 crc kubenswrapper[4812]: I0131 04:33:45.425271 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521" exitCode=0 Jan 31 04:33:45 crc kubenswrapper[4812]: I0131 04:33:45.425656 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" 
event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521"} Jan 31 04:33:45 crc kubenswrapper[4812]: I0131 04:33:45.425725 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d"} Jan 31 04:33:45 crc kubenswrapper[4812]: I0131 04:33:45.425750 4812 scope.go:117] "RemoveContainer" containerID="f8748bb1ca09274116febb15cd6e489a000f8e42d659bdf78629e30c26cc52de" Jan 31 04:36:14 crc kubenswrapper[4812]: I0131 04:36:14.338422 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:36:14 crc kubenswrapper[4812]: I0131 04:36:14.339350 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:36:44 crc kubenswrapper[4812]: I0131 04:36:44.338120 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:36:44 crc kubenswrapper[4812]: I0131 04:36:44.338903 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" 
containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.338091 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.338960 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.356164 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.357003 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.357113 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d" gracePeriod=600 Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 
04:37:14.852184 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d" exitCode=0 Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.852288 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d"} Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.852676 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382"} Jan 31 04:37:14 crc kubenswrapper[4812]: I0131 04:37:14.852712 4812 scope.go:117] "RemoveContainer" containerID="a3e331ce48921e7625aeb93566be87bb5941cdb39529818989611c844d2ef521" Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.862218 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2f9"] Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.864513 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-ovn-metrics" containerID="cri-o://780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.864571 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-node" containerID="cri-o://88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1" gracePeriod=30 Jan 31 04:38:23 
crc kubenswrapper[4812]: I0131 04:38:23.864696 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="sbdb" containerID="cri-o://52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.864601 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-acl-logging" containerID="cri-o://713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.864871 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="northd" containerID="cri-o://482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.864497 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="nbdb" containerID="cri-o://858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.863829 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-controller" containerID="cri-o://1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769" gracePeriod=30 Jan 31 04:38:23 crc kubenswrapper[4812]: I0131 04:38:23.894011 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" 
podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" containerID="cri-o://1df698192345d3a05e597c4d8c10555bf54aaa604d42000637efa3cc4157d915" gracePeriod=30 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.297883 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovnkube-controller/3.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.301562 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-acl-logging/0.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302079 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-controller/0.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302455 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="1df698192345d3a05e597c4d8c10555bf54aaa604d42000637efa3cc4157d915" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302490 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302504 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302516 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302525 4812 generic.go:334] "Generic 
(PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302535 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1" exitCode=0 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302544 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b" exitCode=143 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302552 4812 generic.go:334] "Generic (PLEG): container finished" podID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerID="1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769" exitCode=143 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302601 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"1df698192345d3a05e597c4d8c10555bf54aaa604d42000637efa3cc4157d915"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302639 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302652 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302662 4812 kubelet.go:2453] "SyncLoop (PLEG): event for 
pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302675 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302687 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302698 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302710 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.302732 4812 scope.go:117] "RemoveContainer" containerID="3f807c987793533a982b9cd41b19567ced60b70c44502fc3177afcd139a61a92" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.305253 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/2.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.305788 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/1.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.305831 4812 generic.go:334] "Generic (PLEG): container finished" podID="6050f642-2492-4f83-a739-ac905c409b8c" containerID="ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d" exitCode=2 Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.305895 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerDied","Data":"ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d"} Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.306499 4812 scope.go:117] "RemoveContainer" containerID="ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.308082 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pnwcx_openshift-multus(6050f642-2492-4f83-a739-ac905c409b8c)\"" pod="openshift-multus/multus-pnwcx" podUID="6050f642-2492-4f83-a739-ac905c409b8c" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.359221 4812 scope.go:117] "RemoveContainer" containerID="4e5af758af7ea2bfbeb8743f61b12dfbf1dc47939ed12d3a92828de57500f0fb" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.905018 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-acl-logging/0.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.905825 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-controller/0.log" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.906493 4812 util.go:48] "No ready sandbox for pod 
can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955188 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955265 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cvm2v\" (UniqueName: \"kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955318 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955351 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955383 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955461 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955492 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955538 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955579 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955618 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" 
(UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955650 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955697 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955734 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955765 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955793 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955835 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955892 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955921 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.955953 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin\") pod \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\" (UID: \"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a\") " Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.956142 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns" (OuterVolumeSpecName: "host-run-netns") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-run-netns". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.956202 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch" (OuterVolumeSpecName: "etc-openvswitch") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "etc-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.957430 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd" (OuterVolumeSpecName: "host-cni-netd") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-cni-netd". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.957770 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log" (OuterVolumeSpecName: "node-log") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "node-log". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.957770 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn" (OuterVolumeSpecName: "run-ovn") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "run-ovn". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.957868 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes" (OuterVolumeSpecName: "host-var-lib-cni-networks-ovn-kubernetes") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-var-lib-cni-networks-ovn-kubernetes". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.957897 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket" (OuterVolumeSpecName: "log-socket") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "log-socket". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958138 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides" (OuterVolumeSpecName: "env-overrides") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "env-overrides". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958293 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes" (OuterVolumeSpecName: "host-run-ovn-kubernetes") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-run-ovn-kubernetes". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958310 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units" (OuterVolumeSpecName: "systemd-units") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "systemd-units". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958334 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash" (OuterVolumeSpecName: "host-slash") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-slash". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958379 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet" (OuterVolumeSpecName: "host-kubelet") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-kubelet". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958444 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch" (OuterVolumeSpecName: "run-openvswitch") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "run-openvswitch". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch" (OuterVolumeSpecName: "var-lib-openvswitch") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "var-lib-openvswitch". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958595 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin" (OuterVolumeSpecName: "host-cni-bin") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "host-cni-bin". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958770 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config" (OuterVolumeSpecName: "ovnkube-config") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "ovnkube-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.958786 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib" (OuterVolumeSpecName: "ovnkube-script-lib") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "ovnkube-script-lib". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.959729 4812 reconciler_common.go:293] "Volume detached for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-bin\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.959787 4812 reconciler_common.go:293] "Volume detached for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-etc-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961443 4812 reconciler_common.go:293] "Volume detached for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-log-socket\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961498 4812 reconciler_common.go:293] "Volume detached for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-var-lib-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961615 4812 reconciler_common.go:293] "Volume detached for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-var-lib-cni-networks-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961652 4812 reconciler_common.go:293] "Volume detached for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-slash\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961684 4812 reconciler_common.go:293] "Volume detached for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-kubelet\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc 
kubenswrapper[4812]: I0131 04:38:24.961710 4812 reconciler_common.go:293] "Volume detached for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-netns\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961734 4812 reconciler_common.go:293] "Volume detached for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-env-overrides\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961761 4812 reconciler_common.go:293] "Volume detached for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-ovn\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961785 4812 reconciler_common.go:293] "Volume detached for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-node-log\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961810 4812 reconciler_common.go:293] "Volume detached for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-script-lib\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961872 4812 reconciler_common.go:293] "Volume detached for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-cni-netd\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961899 4812 reconciler_common.go:293] "Volume detached for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-openvswitch\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961924 4812 reconciler_common.go:293] "Volume detached for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovnkube-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961950 4812 reconciler_common.go:293] "Volume detached for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-systemd-units\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.961976 4812 reconciler_common.go:293] "Volume detached for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-host-run-ovn-kubernetes\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.963809 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert" (OuterVolumeSpecName: "ovn-node-metrics-cert") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "ovn-node-metrics-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.964304 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v" (OuterVolumeSpecName: "kube-api-access-cvm2v") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "kube-api-access-cvm2v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983015 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-b4djn"] Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983224 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983239 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983249 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a2bf1dee-41d8-4797-ae33-0e659438727b" containerName="registry" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983255 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a2bf1dee-41d8-4797-ae33-0e659438727b" containerName="registry" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983263 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983270 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983281 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983290 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983301 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" 
containerName="kubecfg-setup" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983308 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kubecfg-setup" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983317 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983324 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983335 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="nbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983341 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="nbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983351 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="northd" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983358 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="northd" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983365 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983372 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983380 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-node" Jan 31 
04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983385 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-node" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983395 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="sbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983401 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="sbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983408 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-acl-logging" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983413 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-acl-logging" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983422 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983427 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983515 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="northd" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983526 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983534 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc 
kubenswrapper[4812]: I0131 04:38:24.983542 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983553 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a2bf1dee-41d8-4797-ae33-0e659438727b" containerName="registry" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983562 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-ovn-metrics" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983570 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983602 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="nbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983613 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="sbdb" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983620 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovn-acl-logging" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983629 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983637 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="kube-rbac-proxy-node" Jan 31 04:38:24 crc kubenswrapper[4812]: E0131 04:38:24.983749 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" 
containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983759 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.983889 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" containerName="ovnkube-controller" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.985635 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:24 crc kubenswrapper[4812]: I0131 04:38:24.987621 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd" (OuterVolumeSpecName: "run-systemd") pod "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" (UID: "d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a"). InnerVolumeSpecName "run-systemd". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063058 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-systemd-units\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063117 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063201 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-bin\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063238 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-netns\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063255 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-netd\") pod \"ovnkube-node-b4djn\" (UID: 
\"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063277 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-script-lib\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063305 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063327 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-etc-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063455 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqvr7\" (UniqueName: \"kubernetes.io/projected/fe7a1e2f-8997-483f-a57e-4c699b99be50-kube-api-access-rqvr7\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063507 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovn-node-metrics-cert\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063579 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-var-lib-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063602 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-ovn\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063620 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-log-socket\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063656 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-kubelet\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063676 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" 
(UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-env-overrides\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063693 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063722 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-node-log\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063744 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-config\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063776 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-slash\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063800 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-systemd\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063915 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-cvm2v\" (UniqueName: \"kubernetes.io/projected/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-kube-api-access-cvm2v\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063938 4812 reconciler_common.go:293] "Volume detached for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-run-systemd\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.063950 4812 reconciler_common.go:293] "Volume detached for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a-ovn-node-metrics-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165139 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165222 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-systemd-units\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165281 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-bin\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165324 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165332 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-netns\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165385 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-run-netns\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165431 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-netd\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165494 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-script-lib\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165501 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-netd\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165519 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-cni-bin\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165557 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165438 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-systemd-units\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165737 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165751 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-etc-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165797 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-etc-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165912 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvr7\" (UniqueName: \"kubernetes.io/projected/fe7a1e2f-8997-483f-a57e-4c699b99be50-kube-api-access-rqvr7\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.165959 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovn-node-metrics-cert\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166009 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-var-lib-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166042 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-ovn\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166073 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-log-socket\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166104 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-kubelet\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166135 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-env-overrides\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166165 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166204 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-node-log\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166237 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-ovn\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166244 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-config\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166320 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-slash\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166338 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166359 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-systemd\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166394 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-log-socket\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166538 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-run-systemd\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166599 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-var-lib-openvswitch\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166634 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-kubelet\") pod 
\"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166648 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-host-slash\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.166685 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/fe7a1e2f-8997-483f-a57e-4c699b99be50-node-log\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.167203 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-script-lib\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.167238 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-env-overrides\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.168036 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovnkube-config\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 
04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.172980 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/fe7a1e2f-8997-483f-a57e-4c699b99be50-ovn-node-metrics-cert\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.197961 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqvr7\" (UniqueName: \"kubernetes.io/projected/fe7a1e2f-8997-483f-a57e-4c699b99be50-kube-api-access-rqvr7\") pod \"ovnkube-node-b4djn\" (UID: \"fe7a1e2f-8997-483f-a57e-4c699b99be50\") " pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.318323 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-acl-logging/0.log" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.319445 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bl2f9_d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/ovn-controller/0.log" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.320137 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" event={"ID":"d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a","Type":"ContainerDied","Data":"33845ad6b8799d85829540bc9956f770d623a8e21cddb107e3c3705d4ffc7930"} Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.320199 4812 scope.go:117] "RemoveContainer" containerID="1df698192345d3a05e597c4d8c10555bf54aaa604d42000637efa3cc4157d915" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.320278 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bl2f9" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.322426 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/2.log" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.339751 4812 scope.go:117] "RemoveContainer" containerID="52af3ab981d7643d40f507c126aba7e4395e3ba998f9f4f62cd6df42bdfe6e78" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.344128 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.372521 4812 scope.go:117] "RemoveContainer" containerID="858650bcce566b16d6bb753b8f017e947ae407e5ba6f1ba337a07ac4f5315f78" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.392502 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2f9"] Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.397322 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-ovn-kubernetes/ovnkube-node-bl2f9"] Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.423073 4812 scope.go:117] "RemoveContainer" containerID="482454181ab8bb2a1300159b59f5e38f758d23d2bfb9f6dd969c6231e1f7f924" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.443775 4812 scope.go:117] "RemoveContainer" containerID="780fba42b87251dc85021d920e58f3b3f4c4e703ab0a754d02924e89c578400a" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.462308 4812 scope.go:117] "RemoveContainer" containerID="88346a13461fa251e6a120ec496e088037646917f43ebcd524c49710f8c6c0f1" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.482943 4812 scope.go:117] "RemoveContainer" containerID="713b548f3c088bbf3186c73f7e3374c110ecb43d18f2a25e1d5e80adf8afcd8b" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.498794 4812 scope.go:117] "RemoveContainer" 
containerID="1fda093de0864491cf4511641f778bc281478240b666f5abcbde5afed473b769" Jan 31 04:38:25 crc kubenswrapper[4812]: I0131 04:38:25.520034 4812 scope.go:117] "RemoveContainer" containerID="66f30b69fe852c98c7d0a1eac34421997953b58e3d711c1d2c2f05514615125f" Jan 31 04:38:26 crc kubenswrapper[4812]: I0131 04:38:26.329974 4812 generic.go:334] "Generic (PLEG): container finished" podID="fe7a1e2f-8997-483f-a57e-4c699b99be50" containerID="c0cfddc1f28396a4b9157339474740047f767197a78bd04a3fb0956576554565" exitCode=0 Jan 31 04:38:26 crc kubenswrapper[4812]: I0131 04:38:26.330071 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerDied","Data":"c0cfddc1f28396a4b9157339474740047f767197a78bd04a3fb0956576554565"} Jan 31 04:38:26 crc kubenswrapper[4812]: I0131 04:38:26.330454 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"10919ccaf770fbe99f274f187c40d457116c23d6af179a173cceace7abe35685"} Jan 31 04:38:26 crc kubenswrapper[4812]: I0131 04:38:26.351587 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a" path="/var/lib/kubelet/pods/d92c2935-9600-4e3b-b6ef-2be7b6d9ef7a/volumes" Jan 31 04:38:27 crc kubenswrapper[4812]: I0131 04:38:27.346775 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"9731aa7343f2ea97c6c8bf391e6928990eb694d82daedf4b8df3c754ef533439"} Jan 31 04:38:27 crc kubenswrapper[4812]: I0131 04:38:27.347647 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" 
event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"1a3a4d951fc962a2de6df940645ef04aa018447f779838fc0debbe737451c90a"} Jan 31 04:38:27 crc kubenswrapper[4812]: I0131 04:38:27.347662 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"0ecb75355b18545412dad3a88fb45f898e4a038cff4299dfeabc73f295c61924"} Jan 31 04:38:27 crc kubenswrapper[4812]: I0131 04:38:27.347673 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"d9a095417dbbea5fa33b3906536f641671b6c3ef17d7d27dbd47b1db1731953b"} Jan 31 04:38:27 crc kubenswrapper[4812]: I0131 04:38:27.347682 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"9704d8e63e1ee7713ca2211601363ad904c2c785b891ce71802b9d0e0c1e1040"} Jan 31 04:38:28 crc kubenswrapper[4812]: I0131 04:38:28.357009 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"cd7e548c06a9fa2d7c5fd60829e92f83a024a44c23f2e9dce0aa80208ebf48a2"} Jan 31 04:38:30 crc kubenswrapper[4812]: I0131 04:38:30.374422 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"9c682e7736a0b15d557b6aec0cb2ba7bf08c44bd707bc7427d26590207738717"} Jan 31 04:38:32 crc kubenswrapper[4812]: I0131 04:38:32.386153 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" 
event={"ID":"fe7a1e2f-8997-483f-a57e-4c699b99be50","Type":"ContainerStarted","Data":"5f576f43aed4d8b3664111b23b4bb0b25bb8d3cf1ce8aafccb4c33bb563bfcbf"} Jan 31 04:38:32 crc kubenswrapper[4812]: I0131 04:38:32.386425 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:32 crc kubenswrapper[4812]: I0131 04:38:32.415827 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:32 crc kubenswrapper[4812]: I0131 04:38:32.420068 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" podStartSLOduration=8.420048667 podStartE2EDuration="8.420048667s" podCreationTimestamp="2026-01-31 04:38:24 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:38:32.417360865 +0000 UTC m=+720.912382530" watchObservedRunningTime="2026-01-31 04:38:32.420048667 +0000 UTC m=+720.915070332" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.390239 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.390587 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.416492 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.841948 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6"] Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.842892 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.846486 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-marketplace"/"default-dockercfg-vmwhc" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.863609 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6"] Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.882554 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pd6vs\" (UniqueName: \"kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.882622 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.882700 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: 
I0131 04:38:33.984020 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.984219 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pd6vs\" (UniqueName: \"kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.984303 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.984942 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:33 crc kubenswrapper[4812]: I0131 04:38:33.985196 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: I0131 04:38:34.009983 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pd6vs\" (UniqueName: \"kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs\") pod \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: I0131 04:38:34.175286 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.199110 4812 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(82e438889e56790fd33936ef143c94490a9806367b99b60c3e2a7dd1dd0596c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.200088 4812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(82e438889e56790fd33936ef143c94490a9806367b99b60c3e2a7dd1dd0596c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.200153 4812 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(82e438889e56790fd33936ef143c94490a9806367b99b60c3e2a7dd1dd0596c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.200239 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(82e438889e56790fd33936ef143c94490a9806367b99b60c3e2a7dd1dd0596c6): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" Jan 31 04:38:34 crc kubenswrapper[4812]: I0131 04:38:34.394977 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: I0131 04:38:34.396138 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.421855 4812 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(bff3ac9da4aba7f87cb87422776d14b5aaa03842c8dedef0e1f7edec0322b21f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.421928 4812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(bff3ac9da4aba7f87cb87422776d14b5aaa03842c8dedef0e1f7edec0322b21f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.421962 4812 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(bff3ac9da4aba7f87cb87422776d14b5aaa03842c8dedef0e1f7edec0322b21f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:34 crc kubenswrapper[4812]: E0131 04:38:34.422016 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(bff3ac9da4aba7f87cb87422776d14b5aaa03842c8dedef0e1f7edec0322b21f): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" Jan 31 04:38:35 crc kubenswrapper[4812]: I0131 04:38:35.339984 4812 scope.go:117] "RemoveContainer" containerID="ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d" Jan 31 04:38:35 crc kubenswrapper[4812]: E0131 04:38:35.340562 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"kube-multus\" with CrashLoopBackOff: \"back-off 20s restarting failed container=kube-multus pod=multus-pnwcx_openshift-multus(6050f642-2492-4f83-a739-ac905c409b8c)\"" pod="openshift-multus/multus-pnwcx" podUID="6050f642-2492-4f83-a739-ac905c409b8c" Jan 31 04:38:48 crc kubenswrapper[4812]: I0131 04:38:48.340404 4812 scope.go:117] "RemoveContainer" containerID="ec3f00b03424a296f7cefb61c7c1482bfba10c00049acb0b2119850f63d75f9d" Jan 31 04:38:49 crc kubenswrapper[4812]: I0131 04:38:49.339134 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:49 crc kubenswrapper[4812]: I0131 04:38:49.340220 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:49 crc kubenswrapper[4812]: E0131 04:38:49.380192 4812 log.go:32] "RunPodSandbox from runtime service failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(62f990e582c22aaa0c101245924e1c4b8670b6c64a9b8fd5b253c42ede2e8857): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" Jan 31 04:38:49 crc kubenswrapper[4812]: E0131 04:38:49.380313 4812 kuberuntime_sandbox.go:72] "Failed to create sandbox for pod" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(62f990e582c22aaa0c101245924e1c4b8670b6c64a9b8fd5b253c42ede2e8857): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:49 crc kubenswrapper[4812]: E0131 04:38:49.380366 4812 kuberuntime_manager.go:1170] "CreatePodSandbox for pod failed" err="rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(62f990e582c22aaa0c101245924e1c4b8670b6c64a9b8fd5b253c42ede2e8857): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:38:49 crc kubenswrapper[4812]: E0131 04:38:49.380480 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"CreatePodSandbox\" for \"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\" with CreatePodSandboxError: \"Failed to create sandbox for pod \\\"270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace(3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a)\\\": rpc error: code = Unknown desc = failed to create pod network sandbox k8s_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_openshift-marketplace_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a_0(62f990e582c22aaa0c101245924e1c4b8670b6c64a9b8fd5b253c42ede2e8857): no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?\"" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" Jan 31 04:38:49 crc kubenswrapper[4812]: I0131 04:38:49.495295 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-pnwcx_6050f642-2492-4f83-a739-ac905c409b8c/kube-multus/2.log" Jan 31 04:38:49 crc kubenswrapper[4812]: I0131 04:38:49.495362 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-pnwcx" event={"ID":"6050f642-2492-4f83-a739-ac905c409b8c","Type":"ContainerStarted","Data":"a3742157352f26a7f53f0595c594d331a15352eefd92e37f2a1f10f75dea6225"} Jan 31 04:38:55 crc kubenswrapper[4812]: I0131 04:38:55.384634 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-b4djn" Jan 31 04:39:04 crc kubenswrapper[4812]: I0131 04:39:04.339020 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:39:04 crc kubenswrapper[4812]: I0131 04:39:04.340107 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:39:06 crc kubenswrapper[4812]: I0131 04:39:06.345508 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6"] Jan 31 04:39:06 crc kubenswrapper[4812]: I0131 04:39:06.607147 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerStarted","Data":"f7e8e65d6f0008bf6909c5e12b17e296db50e7281c0bbbeafaffcf31ecb260e9"} Jan 31 04:39:08 crc kubenswrapper[4812]: I0131 04:39:08.231074 4812 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Jan 31 04:39:08 crc kubenswrapper[4812]: I0131 04:39:08.620591 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerStarted","Data":"662a877342f26b43daed783aafe38eef38fa3d024047934675e20b419c549808"} Jan 31 04:39:09 crc kubenswrapper[4812]: I0131 04:39:09.628423 4812 generic.go:334] "Generic (PLEG): container finished" podID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerID="662a877342f26b43daed783aafe38eef38fa3d024047934675e20b419c549808" exitCode=0 Jan 31 04:39:09 crc kubenswrapper[4812]: I0131 04:39:09.628505 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" 
event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerDied","Data":"662a877342f26b43daed783aafe38eef38fa3d024047934675e20b419c549808"} Jan 31 04:39:09 crc kubenswrapper[4812]: I0131 04:39:09.630870 4812 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 04:39:11 crc kubenswrapper[4812]: I0131 04:39:11.647036 4812 generic.go:334] "Generic (PLEG): container finished" podID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerID="56313eae509f404b909b423e792484cdc3ffd7e0b199d3f28419d5924ae267b6" exitCode=0 Jan 31 04:39:11 crc kubenswrapper[4812]: I0131 04:39:11.647222 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerDied","Data":"56313eae509f404b909b423e792484cdc3ffd7e0b199d3f28419d5924ae267b6"} Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.034793 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.036505 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.052689 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.136381 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p5rsg\" (UniqueName: \"kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.136440 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.136489 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.237947 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p5rsg\" (UniqueName: \"kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.238484 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.238636 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.239234 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.239280 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.277125 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p5rsg\" (UniqueName: \"kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg\") pod \"redhat-operators-k9mrf\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.407655 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.630828 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:12 crc kubenswrapper[4812]: W0131 04:39:12.640979 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4069389d_1df5_4643_a076_509e7361bb60.slice/crio-e3b7fbbdf74c0bbf4d05120091a24d520f62ebb0e7fb07ecadfcdd37318eaf7f WatchSource:0}: Error finding container e3b7fbbdf74c0bbf4d05120091a24d520f62ebb0e7fb07ecadfcdd37318eaf7f: Status 404 returned error can't find the container with id e3b7fbbdf74c0bbf4d05120091a24d520f62ebb0e7fb07ecadfcdd37318eaf7f Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.652764 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerStarted","Data":"e3b7fbbdf74c0bbf4d05120091a24d520f62ebb0e7fb07ecadfcdd37318eaf7f"} Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.654306 4812 generic.go:334] "Generic (PLEG): container finished" podID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerID="985e81565e01c1181aa0b1365066ece7f1a3ac91d1736defe8642422b15894b0" exitCode=0 Jan 31 04:39:12 crc kubenswrapper[4812]: I0131 04:39:12.654331 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerDied","Data":"985e81565e01c1181aa0b1365066ece7f1a3ac91d1736defe8642422b15894b0"} Jan 31 04:39:13 crc kubenswrapper[4812]: I0131 04:39:13.664297 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" 
event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerStarted","Data":"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf"} Jan 31 04:39:13 crc kubenswrapper[4812]: I0131 04:39:13.980497 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.164991 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pd6vs\" (UniqueName: \"kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs\") pod \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.165191 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle\") pod \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.165271 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util\") pod \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\" (UID: \"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a\") " Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.166913 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle" (OuterVolumeSpecName: "bundle") pod "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" (UID: "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a"). InnerVolumeSpecName "bundle". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.173141 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs" (OuterVolumeSpecName: "kube-api-access-pd6vs") pod "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" (UID: "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a"). InnerVolumeSpecName "kube-api-access-pd6vs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.266889 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.266961 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pd6vs\" (UniqueName: \"kubernetes.io/projected/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-kube-api-access-pd6vs\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.338528 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.338636 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.375032 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util" (OuterVolumeSpecName: "util") pod "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" (UID: "3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.469994 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.674928 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" event={"ID":"3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a","Type":"ContainerDied","Data":"f7e8e65d6f0008bf6909c5e12b17e296db50e7281c0bbbeafaffcf31ecb260e9"} Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.674989 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f7e8e65d6f0008bf6909c5e12b17e296db50e7281c0bbbeafaffcf31ecb260e9" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.675106 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6" Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.678216 4812 generic.go:334] "Generic (PLEG): container finished" podID="4069389d-1df5-4643-a076-509e7361bb60" containerID="747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf" exitCode=0 Jan 31 04:39:14 crc kubenswrapper[4812]: I0131 04:39:14.678269 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerDied","Data":"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf"} Jan 31 04:39:15 crc kubenswrapper[4812]: I0131 04:39:15.687034 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerStarted","Data":"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7"} Jan 31 04:39:16 crc kubenswrapper[4812]: I0131 04:39:16.695349 4812 generic.go:334] "Generic (PLEG): container finished" podID="4069389d-1df5-4643-a076-509e7361bb60" containerID="661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7" exitCode=0 Jan 31 04:39:16 crc kubenswrapper[4812]: I0131 04:39:16.695413 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerDied","Data":"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7"} Jan 31 04:39:17 crc kubenswrapper[4812]: I0131 04:39:17.708062 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerStarted","Data":"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e"} Jan 31 04:39:22 crc kubenswrapper[4812]: I0131 04:39:22.408078 4812 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:22 crc kubenswrapper[4812]: I0131 04:39:22.408359 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:23 crc kubenswrapper[4812]: I0131 04:39:23.459682 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-k9mrf" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="registry-server" probeResult="failure" output=< Jan 31 04:39:23 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:39:23 crc kubenswrapper[4812]: > Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.389985 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-k9mrf" podStartSLOduration=11.938343774 podStartE2EDuration="14.389968848s" podCreationTimestamp="2026-01-31 04:39:12 +0000 UTC" firstStartedPulling="2026-01-31 04:39:14.680375544 +0000 UTC m=+763.175397239" lastFinishedPulling="2026-01-31 04:39:17.132000628 +0000 UTC m=+765.627022313" observedRunningTime="2026-01-31 04:39:17.737063536 +0000 UTC m=+766.232085251" watchObservedRunningTime="2026-01-31 04:39:26.389968848 +0000 UTC m=+774.884990513" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.393831 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2"] Jan 31 04:39:26 crc kubenswrapper[4812]: E0131 04:39:26.394128 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="extract" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.394155 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="extract" Jan 31 04:39:26 crc kubenswrapper[4812]: E0131 04:39:26.394184 4812 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="util" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.394196 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="util" Jan 31 04:39:26 crc kubenswrapper[4812]: E0131 04:39:26.394222 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="pull" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.394233 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="pull" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.394378 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a" containerName="extract" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.394935 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.397543 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-controller-manager-service-cert" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.398236 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"manager-account-dockercfg-tmftx" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.398429 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-cert" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.398977 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"openshift-service-ca.crt" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.399029 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"kube-root-ca.crt" Jan 31 04:39:26 
crc kubenswrapper[4812]: I0131 04:39:26.413477 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2"] Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.541424 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-apiservice-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.541478 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-webhook-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.541616 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xlng\" (UniqueName: \"kubernetes.io/projected/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-kube-api-access-8xlng\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.643274 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8xlng\" (UniqueName: \"kubernetes.io/projected/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-kube-api-access-8xlng\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " 
pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.643563 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-apiservice-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.643644 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-webhook-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.649915 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-webhook-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.659339 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-apiservice-cert\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.661366 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xlng\" (UniqueName: 
\"kubernetes.io/projected/3e6043c0-61f1-4cb1-a1df-056d81c22ea0-kube-api-access-8xlng\") pod \"metallb-operator-controller-manager-8458b697d8-qwvb2\" (UID: \"3e6043c0-61f1-4cb1-a1df-056d81c22ea0\") " pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.709404 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.715758 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6"] Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.716537 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.718336 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-dockercfg-l42br" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.718558 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.718723 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-operator-webhook-server-service-cert" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.728214 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6"] Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.847772 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59wbw\" (UniqueName: \"kubernetes.io/projected/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-kube-api-access-59wbw\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: 
\"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.848168 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-webhook-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.848219 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.948871 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.948929 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-59wbw\" (UniqueName: \"kubernetes.io/projected/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-kube-api-access-59wbw\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.948967 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-webhook-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.954938 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-apiservice-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.956678 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-webhook-cert\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.972813 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2"] Jan 31 04:39:26 crc kubenswrapper[4812]: I0131 04:39:26.974558 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-59wbw\" (UniqueName: \"kubernetes.io/projected/dfe9ef2d-1841-414a-9645-84d15e3fa9e5-kube-api-access-59wbw\") pod \"metallb-operator-webhook-server-56b4c6c8df-wjtg6\" (UID: \"dfe9ef2d-1841-414a-9645-84d15e3fa9e5\") " pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:27 crc kubenswrapper[4812]: I0131 04:39:27.079653 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:27 crc kubenswrapper[4812]: I0131 04:39:27.250665 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6"] Jan 31 04:39:27 crc kubenswrapper[4812]: W0131 04:39:27.256954 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poddfe9ef2d_1841_414a_9645_84d15e3fa9e5.slice/crio-0afe8fcf7780f7e0370269ef88043bfa251c3322cf5f8dc11b2d27fe314bfeb9 WatchSource:0}: Error finding container 0afe8fcf7780f7e0370269ef88043bfa251c3322cf5f8dc11b2d27fe314bfeb9: Status 404 returned error can't find the container with id 0afe8fcf7780f7e0370269ef88043bfa251c3322cf5f8dc11b2d27fe314bfeb9 Jan 31 04:39:27 crc kubenswrapper[4812]: I0131 04:39:27.761888 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" event={"ID":"dfe9ef2d-1841-414a-9645-84d15e3fa9e5","Type":"ContainerStarted","Data":"0afe8fcf7780f7e0370269ef88043bfa251c3322cf5f8dc11b2d27fe314bfeb9"} Jan 31 04:39:27 crc kubenswrapper[4812]: I0131 04:39:27.763273 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" event={"ID":"3e6043c0-61f1-4cb1-a1df-056d81c22ea0","Type":"ContainerStarted","Data":"50def0850207c1a3c59ef21d3c1c62ffaee63609d04f8372fd2165bdb13eb433"} Jan 31 04:39:30 crc kubenswrapper[4812]: I0131 04:39:30.785728 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" event={"ID":"3e6043c0-61f1-4cb1-a1df-056d81c22ea0","Type":"ContainerStarted","Data":"5e1cc60351428347ac14b77da7c09dd22073dc9ee9426ea1e2c9c4a5aa56ff09"} Jan 31 04:39:30 crc kubenswrapper[4812]: I0131 04:39:30.787208 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:39:30 crc kubenswrapper[4812]: I0131 04:39:30.809889 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" podStartSLOduration=1.771992847 podStartE2EDuration="4.809867635s" podCreationTimestamp="2026-01-31 04:39:26 +0000 UTC" firstStartedPulling="2026-01-31 04:39:26.967470049 +0000 UTC m=+775.462491714" lastFinishedPulling="2026-01-31 04:39:30.005344837 +0000 UTC m=+778.500366502" observedRunningTime="2026-01-31 04:39:30.806736412 +0000 UTC m=+779.301758097" watchObservedRunningTime="2026-01-31 04:39:30.809867635 +0000 UTC m=+779.304889320" Jan 31 04:39:32 crc kubenswrapper[4812]: I0131 04:39:32.447227 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:32 crc kubenswrapper[4812]: I0131 04:39:32.485425 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:32 crc kubenswrapper[4812]: I0131 04:39:32.671935 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:33 crc kubenswrapper[4812]: I0131 04:39:33.803029 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" event={"ID":"dfe9ef2d-1841-414a-9645-84d15e3fa9e5","Type":"ContainerStarted","Data":"970f5a413bc304fbc0b8fec88b6be769b778e0a66513dccfcae63796bd4e8d1b"} Jan 31 04:39:33 crc kubenswrapper[4812]: I0131 04:39:33.803489 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:39:33 crc kubenswrapper[4812]: I0131 04:39:33.803187 4812 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="openshift-marketplace/redhat-operators-k9mrf" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="registry-server" containerID="cri-o://8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e" gracePeriod=2 Jan 31 04:39:33 crc kubenswrapper[4812]: I0131 04:39:33.826908 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" podStartSLOduration=1.820620849 podStartE2EDuration="7.826892956s" podCreationTimestamp="2026-01-31 04:39:26 +0000 UTC" firstStartedPulling="2026-01-31 04:39:27.260070662 +0000 UTC m=+775.755092327" lastFinishedPulling="2026-01-31 04:39:33.266342719 +0000 UTC m=+781.761364434" observedRunningTime="2026-01-31 04:39:33.82371208 +0000 UTC m=+782.318733745" watchObservedRunningTime="2026-01-31 04:39:33.826892956 +0000 UTC m=+782.321914621" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.202599 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.345907 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p5rsg\" (UniqueName: \"kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg\") pod \"4069389d-1df5-4643-a076-509e7361bb60\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.345967 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content\") pod \"4069389d-1df5-4643-a076-509e7361bb60\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.346082 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities\") pod \"4069389d-1df5-4643-a076-509e7361bb60\" (UID: \"4069389d-1df5-4643-a076-509e7361bb60\") " Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.347223 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities" (OuterVolumeSpecName: "utilities") pod "4069389d-1df5-4643-a076-509e7361bb60" (UID: "4069389d-1df5-4643-a076-509e7361bb60"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.347326 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.356955 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg" (OuterVolumeSpecName: "kube-api-access-p5rsg") pod "4069389d-1df5-4643-a076-509e7361bb60" (UID: "4069389d-1df5-4643-a076-509e7361bb60"). InnerVolumeSpecName "kube-api-access-p5rsg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.448412 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p5rsg\" (UniqueName: \"kubernetes.io/projected/4069389d-1df5-4643-a076-509e7361bb60-kube-api-access-p5rsg\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.494703 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "4069389d-1df5-4643-a076-509e7361bb60" (UID: "4069389d-1df5-4643-a076-509e7361bb60"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.550042 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/4069389d-1df5-4643-a076-509e7361bb60-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.809704 4812 generic.go:334] "Generic (PLEG): container finished" podID="4069389d-1df5-4643-a076-509e7361bb60" containerID="8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e" exitCode=0 Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.809784 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-k9mrf" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.809775 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerDied","Data":"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e"} Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.810174 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-k9mrf" event={"ID":"4069389d-1df5-4643-a076-509e7361bb60","Type":"ContainerDied","Data":"e3b7fbbdf74c0bbf4d05120091a24d520f62ebb0e7fb07ecadfcdd37318eaf7f"} Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.810205 4812 scope.go:117] "RemoveContainer" containerID="8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.843467 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.850542 4812 scope.go:117] "RemoveContainer" containerID="661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 
04:39:34.854475 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-k9mrf"] Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.874535 4812 scope.go:117] "RemoveContainer" containerID="747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.894870 4812 scope.go:117] "RemoveContainer" containerID="8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e" Jan 31 04:39:34 crc kubenswrapper[4812]: E0131 04:39:34.895389 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e\": container with ID starting with 8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e not found: ID does not exist" containerID="8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.895451 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e"} err="failed to get container status \"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e\": rpc error: code = NotFound desc = could not find container \"8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e\": container with ID starting with 8fcc77daef81d34c629d52e7e6dd5150d6fbbb637bf75bf1cf3474bb0f6b566e not found: ID does not exist" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.895491 4812 scope.go:117] "RemoveContainer" containerID="661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7" Jan 31 04:39:34 crc kubenswrapper[4812]: E0131 04:39:34.895863 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7\": container with ID 
starting with 661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7 not found: ID does not exist" containerID="661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.895921 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7"} err="failed to get container status \"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7\": rpc error: code = NotFound desc = could not find container \"661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7\": container with ID starting with 661cf2c37607a6f71e5b140628671232334ccad48a7726f13f53c852b02d21a7 not found: ID does not exist" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.895957 4812 scope.go:117] "RemoveContainer" containerID="747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf" Jan 31 04:39:34 crc kubenswrapper[4812]: E0131 04:39:34.896501 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf\": container with ID starting with 747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf not found: ID does not exist" containerID="747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf" Jan 31 04:39:34 crc kubenswrapper[4812]: I0131 04:39:34.896562 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf"} err="failed to get container status \"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf\": rpc error: code = NotFound desc = could not find container \"747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf\": container with ID starting with 747567be6d77aef8139fcda4cdb8e35c6347ecb8371254e080c0720c38a746cf not found: 
ID does not exist" Jan 31 04:39:36 crc kubenswrapper[4812]: I0131 04:39:36.347296 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4069389d-1df5-4643-a076-509e7361bb60" path="/var/lib/kubelet/pods/4069389d-1df5-4643-a076-509e7361bb60/volumes" Jan 31 04:39:44 crc kubenswrapper[4812]: I0131 04:39:44.338395 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:39:44 crc kubenswrapper[4812]: I0131 04:39:44.338771 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:39:47 crc kubenswrapper[4812]: I0131 04:39:47.089597 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-webhook-server-56b4c6c8df-wjtg6" Jan 31 04:40:06 crc kubenswrapper[4812]: I0131 04:40:06.715448 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/metallb-operator-controller-manager-8458b697d8-qwvb2" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.536137 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-7qjt7"] Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.536368 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="extract-utilities" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.536383 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="extract-utilities" Jan 31 
04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.536404 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="extract-content" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.536413 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="extract-content" Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.536430 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="registry-server" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.536438 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="registry-server" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.536590 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4069389d-1df5-4643-a076-509e7361bb60" containerName="registry-server" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.545628 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.552737 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"frr-startup" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.553022 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-certs-secret" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.556001 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-daemon-dockercfg-t66h6" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.561283 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff"] Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.562882 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.565451 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"frr-k8s-webhook-server-cert" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.569831 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff"] Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.617867 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/speaker-pnrw7"] Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618316 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-conf\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618390 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-sockets\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618429 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7g6\" (UniqueName: \"kubernetes.io/projected/7d26d631-eae7-43cb-9df7-ef994fbb752d-kube-api-access-4v7g6\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618477 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"reloader\" (UniqueName: 
\"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-reloader\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618498 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wncrp\" (UniqueName: \"kubernetes.io/projected/c8c1c067-45c3-4fc8-b656-920d058691ee-kube-api-access-wncrp\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618535 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-startup\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618554 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618581 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.618615 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics-certs\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.619223 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.622808 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-dockercfg-47rmf" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.623186 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-memberlist" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.623387 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"metallb-system"/"metallb-excludel2" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.623801 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"speaker-certs-secret" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.631731 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["metallb-system/controller-6968d8fdc4-hldjr"] Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.633571 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.641197 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"controller-certs-secret" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.666166 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-hldjr"] Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.719814 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-sockets\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.719927 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9sw9s\" (UniqueName: \"kubernetes.io/projected/b0653050-2c1a-48f2-9f1b-10ccd0366143-kube-api-access-9sw9s\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.719957 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.719980 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7g6\" (UniqueName: \"kubernetes.io/projected/7d26d631-eae7-43cb-9df7-ef994fbb752d-kube-api-access-4v7g6\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: 
I0131 04:40:07.720001 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-metrics-certs\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720034 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-reloader\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720057 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9874\" (UniqueName: \"kubernetes.io/projected/b7380313-059d-4437-a3dc-371ce0a51fc3-kube-api-access-g9874\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720083 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wncrp\" (UniqueName: \"kubernetes.io/projected/c8c1c067-45c3-4fc8-b656-920d058691ee-kube-api-access-wncrp\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720124 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-cert\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720144 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720169 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-startup\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720191 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720216 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b0653050-2c1a-48f2-9f1b-10ccd0366143-metallb-excludel2\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720247 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720274 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: 
\"kubernetes.io/secret/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics-certs\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720299 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-conf\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720442 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-sockets\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-sockets\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.720451 4812 secret.go:188] Couldn't get secret metallb-system/frr-k8s-webhook-server-cert: secret "frr-k8s-webhook-server-cert" not found Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.720612 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert podName:c8c1c067-45c3-4fc8-b656-920d058691ee nodeName:}" failed. No retries permitted until 2026-01-31 04:40:08.220587841 +0000 UTC m=+816.715609506 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert") pod "frr-k8s-webhook-server-7df86c4f6c-kctff" (UID: "c8c1c067-45c3-4fc8-b656-920d058691ee") : secret "frr-k8s-webhook-server-cert" not found Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.720955 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.721177 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"reloader\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-reloader\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.721237 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-startup\" (UniqueName: \"kubernetes.io/configmap/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-startup\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.721378 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"frr-conf\" (UniqueName: \"kubernetes.io/empty-dir/7d26d631-eae7-43cb-9df7-ef994fbb752d-frr-conf\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.726224 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/7d26d631-eae7-43cb-9df7-ef994fbb752d-metrics-certs\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " 
pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.739690 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wncrp\" (UniqueName: \"kubernetes.io/projected/c8c1c067-45c3-4fc8-b656-920d058691ee-kube-api-access-wncrp\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.741573 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7g6\" (UniqueName: \"kubernetes.io/projected/7d26d631-eae7-43cb-9df7-ef994fbb752d-kube-api-access-4v7g6\") pod \"frr-k8s-7qjt7\" (UID: \"7d26d631-eae7-43cb-9df7-ef994fbb752d\") " pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.821929 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9sw9s\" (UniqueName: \"kubernetes.io/projected/b0653050-2c1a-48f2-9f1b-10ccd0366143-kube-api-access-9sw9s\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.821976 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.821995 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-metrics-certs\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 
04:40:07.822022 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-g9874\" (UniqueName: \"kubernetes.io/projected/b7380313-059d-4437-a3dc-371ce0a51fc3-kube-api-access-g9874\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.822042 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-cert\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.822055 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.822082 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b0653050-2c1a-48f2-9f1b-10ccd0366143-metallb-excludel2\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.822731 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metallb-excludel2\" (UniqueName: \"kubernetes.io/configmap/b0653050-2c1a-48f2-9f1b-10ccd0366143-metallb-excludel2\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.822464 4812 secret.go:188] Couldn't get secret metallb-system/controller-certs-secret: secret "controller-certs-secret" not found 
Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.822945 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs podName:b7380313-059d-4437-a3dc-371ce0a51fc3 nodeName:}" failed. No retries permitted until 2026-01-31 04:40:08.32292663 +0000 UTC m=+816.817948295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs") pod "controller-6968d8fdc4-hldjr" (UID: "b7380313-059d-4437-a3dc-371ce0a51fc3") : secret "controller-certs-secret" not found Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.822608 4812 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 04:40:07 crc kubenswrapper[4812]: E0131 04:40:07.823071 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist podName:b0653050-2c1a-48f2-9f1b-10ccd0366143 nodeName:}" failed. No retries permitted until 2026-01-31 04:40:08.323046853 +0000 UTC m=+816.818068518 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist") pod "speaker-pnrw7" (UID: "b0653050-2c1a-48f2-9f1b-10ccd0366143") : secret "metallb-memberlist" not found Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.826000 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-metrics-certs\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.826254 4812 reflector.go:368] Caches populated for *v1.Secret from object-"metallb-system"/"metallb-webhook-cert" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.836265 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-cert\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.839877 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9874\" (UniqueName: \"kubernetes.io/projected/b7380313-059d-4437-a3dc-371ce0a51fc3-kube-api-access-g9874\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.852627 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9sw9s\" (UniqueName: \"kubernetes.io/projected/b0653050-2c1a-48f2-9f1b-10ccd0366143-kube-api-access-9sw9s\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:07 crc kubenswrapper[4812]: I0131 04:40:07.872774 4812 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.225946 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.231792 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c8c1c067-45c3-4fc8-b656-920d058691ee-cert\") pod \"frr-k8s-webhook-server-7df86c4f6c-kctff\" (UID: \"c8c1c067-45c3-4fc8-b656-920d058691ee\") " pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.327134 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.327256 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:08 crc kubenswrapper[4812]: E0131 04:40:08.327451 4812 secret.go:188] Couldn't get secret metallb-system/metallb-memberlist: secret "metallb-memberlist" not found Jan 31 04:40:08 crc kubenswrapper[4812]: E0131 04:40:08.327524 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist 
podName:b0653050-2c1a-48f2-9f1b-10ccd0366143 nodeName:}" failed. No retries permitted until 2026-01-31 04:40:09.327505408 +0000 UTC m=+817.822527083 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "memberlist" (UniqueName: "kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist") pod "speaker-pnrw7" (UID: "b0653050-2c1a-48f2-9f1b-10ccd0366143") : secret "metallb-memberlist" not found Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.332485 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/b7380313-059d-4437-a3dc-371ce0a51fc3-metrics-certs\") pod \"controller-6968d8fdc4-hldjr\" (UID: \"b7380313-059d-4437-a3dc-371ce0a51fc3\") " pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.492770 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.547538 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.797529 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff"] Jan 31 04:40:08 crc kubenswrapper[4812]: W0131 04:40:08.802064 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc8c1c067_45c3_4fc8_b656_920d058691ee.slice/crio-99d6cbb86bf77c0c51559ab14bdaebf5b090f028876ee9325d69c7368ec9d25f WatchSource:0}: Error finding container 99d6cbb86bf77c0c51559ab14bdaebf5b090f028876ee9325d69c7368ec9d25f: Status 404 returned error can't find the container with id 99d6cbb86bf77c0c51559ab14bdaebf5b090f028876ee9325d69c7368ec9d25f Jan 31 04:40:08 crc kubenswrapper[4812]: I0131 04:40:08.839296 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["metallb-system/controller-6968d8fdc4-hldjr"] Jan 31 04:40:08 crc kubenswrapper[4812]: W0131 04:40:08.846981 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7380313_059d_4437_a3dc_371ce0a51fc3.slice/crio-2b705ad09d153814eeb171eaed29ee54577a25f7c9fd369c46eef78c68171066 WatchSource:0}: Error finding container 2b705ad09d153814eeb171eaed29ee54577a25f7c9fd369c46eef78c68171066: Status 404 returned error can't find the container with id 2b705ad09d153814eeb171eaed29ee54577a25f7c9fd369c46eef78c68171066 Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.021988 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" event={"ID":"c8c1c067-45c3-4fc8-b656-920d058691ee","Type":"ContainerStarted","Data":"99d6cbb86bf77c0c51559ab14bdaebf5b090f028876ee9325d69c7368ec9d25f"} Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.023915 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" 
event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"3d622bc48805127740b51827a5f3b4771532b49c23caed76af384b77408ec3cd"} Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.025555 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-hldjr" event={"ID":"b7380313-059d-4437-a3dc-371ce0a51fc3","Type":"ContainerStarted","Data":"4924bf1b13b87f07685b258c9ab7a3457e75d9483575c82baac05f2faf440677"} Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.025590 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-hldjr" event={"ID":"b7380313-059d-4437-a3dc-371ce0a51fc3","Type":"ContainerStarted","Data":"2b705ad09d153814eeb171eaed29ee54577a25f7c9fd369c46eef78c68171066"} Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.342069 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.349527 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"memberlist\" (UniqueName: \"kubernetes.io/secret/b0653050-2c1a-48f2-9f1b-10ccd0366143-memberlist\") pod \"speaker-pnrw7\" (UID: \"b0653050-2c1a-48f2-9f1b-10ccd0366143\") " pod="metallb-system/speaker-pnrw7" Jan 31 04:40:09 crc kubenswrapper[4812]: I0131 04:40:09.433362 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="metallb-system/speaker-pnrw7" Jan 31 04:40:09 crc kubenswrapper[4812]: W0131 04:40:09.467726 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb0653050_2c1a_48f2_9f1b_10ccd0366143.slice/crio-de136bfc000338c0f21d596931f7b9b6950b19fde99a337211ac278247ca4be3 WatchSource:0}: Error finding container de136bfc000338c0f21d596931f7b9b6950b19fde99a337211ac278247ca4be3: Status 404 returned error can't find the container with id de136bfc000338c0f21d596931f7b9b6950b19fde99a337211ac278247ca4be3 Jan 31 04:40:10 crc kubenswrapper[4812]: I0131 04:40:10.033436 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pnrw7" event={"ID":"b0653050-2c1a-48f2-9f1b-10ccd0366143","Type":"ContainerStarted","Data":"dc37212f73dce87889d7b740c4cf54f38f4725fecd1f34ce9d5d9c96f3582300"} Jan 31 04:40:10 crc kubenswrapper[4812]: I0131 04:40:10.033684 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pnrw7" event={"ID":"b0653050-2c1a-48f2-9f1b-10ccd0366143","Type":"ContainerStarted","Data":"de136bfc000338c0f21d596931f7b9b6950b19fde99a337211ac278247ca4be3"} Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.065598 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/speaker-pnrw7" event={"ID":"b0653050-2c1a-48f2-9f1b-10ccd0366143","Type":"ContainerStarted","Data":"9fb2f6ba8e61c6fc598ce6cc2a7680cbd3ddc9cfba8ed5143123048859f85e3d"} Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.066037 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/speaker-pnrw7" Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.068332 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/controller-6968d8fdc4-hldjr" event={"ID":"b7380313-059d-4437-a3dc-371ce0a51fc3","Type":"ContainerStarted","Data":"371cca13660649d335abdba5afab5b85699e664be53d2e79bae4c0b1a6df951f"} 
Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.068501 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.083281 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/speaker-pnrw7" podStartSLOduration=3.455837547 podStartE2EDuration="6.083262677s" podCreationTimestamp="2026-01-31 04:40:07 +0000 UTC" firstStartedPulling="2026-01-31 04:40:09.968177492 +0000 UTC m=+818.463199157" lastFinishedPulling="2026-01-31 04:40:12.595602622 +0000 UTC m=+821.090624287" observedRunningTime="2026-01-31 04:40:13.078642934 +0000 UTC m=+821.573664629" watchObservedRunningTime="2026-01-31 04:40:13.083262677 +0000 UTC m=+821.578284342" Jan 31 04:40:13 crc kubenswrapper[4812]: I0131 04:40:13.099088 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/controller-6968d8fdc4-hldjr" podStartSLOduration=2.502882645 podStartE2EDuration="6.09907005s" podCreationTimestamp="2026-01-31 04:40:07 +0000 UTC" firstStartedPulling="2026-01-31 04:40:08.991572137 +0000 UTC m=+817.486593822" lastFinishedPulling="2026-01-31 04:40:12.587759562 +0000 UTC m=+821.082781227" observedRunningTime="2026-01-31 04:40:13.095650929 +0000 UTC m=+821.590672594" watchObservedRunningTime="2026-01-31 04:40:13.09907005 +0000 UTC m=+821.594091715" Jan 31 04:40:14 crc kubenswrapper[4812]: I0131 04:40:14.337962 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:40:14 crc kubenswrapper[4812]: I0131 04:40:14.338017 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" 
podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:40:14 crc kubenswrapper[4812]: I0131 04:40:14.338063 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:40:14 crc kubenswrapper[4812]: I0131 04:40:14.338592 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:40:14 crc kubenswrapper[4812]: I0131 04:40:14.338652 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382" gracePeriod=600 Jan 31 04:40:15 crc kubenswrapper[4812]: I0131 04:40:15.085713 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382" exitCode=0 Jan 31 04:40:15 crc kubenswrapper[4812]: I0131 04:40:15.085762 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382"} Jan 31 04:40:15 crc kubenswrapper[4812]: I0131 04:40:15.085800 4812 scope.go:117] "RemoveContainer" 
containerID="a3a494ee07f97dfd2482ef89bc2ad119ec04f37ba57a9282577848803151a65d" Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.110404 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" event={"ID":"c8c1c067-45c3-4fc8-b656-920d058691ee","Type":"ContainerStarted","Data":"0c1ed31d2181066523fb0c174c3819983796283ed1aada51436a4a7e6f0d1ae8"} Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.110746 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.112614 4812 generic.go:334] "Generic (PLEG): container finished" podID="7d26d631-eae7-43cb-9df7-ef994fbb752d" containerID="e40f123c4fda21921c92ac5b53d4687dd4bde8edd897c4cd74eae5d019e595d8" exitCode=0 Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.112690 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerDied","Data":"e40f123c4fda21921c92ac5b53d4687dd4bde8edd897c4cd74eae5d019e595d8"} Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.115794 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3"} Jan 31 04:40:17 crc kubenswrapper[4812]: I0131 04:40:17.132326 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" podStartSLOduration=2.15646855 podStartE2EDuration="10.132302426s" podCreationTimestamp="2026-01-31 04:40:07 +0000 UTC" firstStartedPulling="2026-01-31 04:40:08.80605399 +0000 UTC m=+817.301075645" lastFinishedPulling="2026-01-31 04:40:16.781887846 +0000 UTC m=+825.276909521" 
observedRunningTime="2026-01-31 04:40:17.124313232 +0000 UTC m=+825.619334937" watchObservedRunningTime="2026-01-31 04:40:17.132302426 +0000 UTC m=+825.627324121" Jan 31 04:40:18 crc kubenswrapper[4812]: I0131 04:40:18.127417 4812 generic.go:334] "Generic (PLEG): container finished" podID="7d26d631-eae7-43cb-9df7-ef994fbb752d" containerID="c6cde509c1a24ace24d2885aed0ba900f84694891913a96cf780b95fdc13193b" exitCode=0 Jan 31 04:40:18 crc kubenswrapper[4812]: I0131 04:40:18.127471 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerDied","Data":"c6cde509c1a24ace24d2885aed0ba900f84694891913a96cf780b95fdc13193b"} Jan 31 04:40:18 crc kubenswrapper[4812]: I0131 04:40:18.553499 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/controller-6968d8fdc4-hldjr" Jan 31 04:40:19 crc kubenswrapper[4812]: I0131 04:40:19.139243 4812 generic.go:334] "Generic (PLEG): container finished" podID="7d26d631-eae7-43cb-9df7-ef994fbb752d" containerID="df2a1b03ca545cceb4eca9215bea539545a0e15680432c7f69905d02457e3baf" exitCode=0 Jan 31 04:40:19 crc kubenswrapper[4812]: I0131 04:40:19.139550 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerDied","Data":"df2a1b03ca545cceb4eca9215bea539545a0e15680432c7f69905d02457e3baf"} Jan 31 04:40:19 crc kubenswrapper[4812]: I0131 04:40:19.439036 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/speaker-pnrw7" Jan 31 04:40:20 crc kubenswrapper[4812]: I0131 04:40:20.149254 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"1090be6bf55108c0e062bf778cfbc6f45ea03f71216b24e8dcbcf1dc786bcae7"} Jan 31 04:40:20 crc kubenswrapper[4812]: I0131 
04:40:20.149299 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"220328cfcaf16a5cda38738ceb48486e05fb16f53d4d6f21b49cd0fd5b825151"} Jan 31 04:40:20 crc kubenswrapper[4812]: I0131 04:40:20.149314 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"9da8d077bc0d1ad5839cfd520c68d00f348c7b85cd20128c327282a5ceefa812"} Jan 31 04:40:21 crc kubenswrapper[4812]: I0131 04:40:21.167408 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"c73a2fc769259ccb02af63bbe4012a5e412c40bb41227eae7a14b436b797220f"} Jan 31 04:40:21 crc kubenswrapper[4812]: I0131 04:40:21.168726 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"6ec34c13d403d757f59ce20af44ef6cece2335dba49ca87d9e1bbc3fd3a0211a"} Jan 31 04:40:21 crc kubenswrapper[4812]: I0131 04:40:21.168816 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:21 crc kubenswrapper[4812]: I0131 04:40:21.168918 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="metallb-system/frr-k8s-7qjt7" event={"ID":"7d26d631-eae7-43cb-9df7-ef994fbb752d","Type":"ContainerStarted","Data":"d48e0ab23712e6791b727a3abfc9233960080b00f9f7319dc023739dd8796085"} Jan 31 04:40:21 crc kubenswrapper[4812]: I0131 04:40:21.191614 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="metallb-system/frr-k8s-7qjt7" podStartSLOduration=5.489675406 podStartE2EDuration="14.1915949s" podCreationTimestamp="2026-01-31 04:40:07 +0000 UTC" firstStartedPulling="2026-01-31 04:40:08.04927348 +0000 UTC 
m=+816.544295165" lastFinishedPulling="2026-01-31 04:40:16.751192984 +0000 UTC m=+825.246214659" observedRunningTime="2026-01-31 04:40:21.187278754 +0000 UTC m=+829.682300459" watchObservedRunningTime="2026-01-31 04:40:21.1915949 +0000 UTC m=+829.686616565" Jan 31 04:40:22 crc kubenswrapper[4812]: I0131 04:40:22.874372 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:22 crc kubenswrapper[4812]: I0131 04:40:22.940490 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="metallb-system/frr-k8s-7qjt7" Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.925815 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.927387 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.929518 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-index-dockercfg-jzzqd" Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.929613 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"kube-root-ca.crt" Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.931776 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openstack-operators"/"openshift-service-ca.crt" Jan 31 04:40:24 crc kubenswrapper[4812]: I0131 04:40:24.968182 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:25 crc kubenswrapper[4812]: I0131 04:40:25.004234 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhs27\" (UniqueName: \"kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27\") pod 
\"mariadb-operator-index-h7wjp\" (UID: \"1eff129a-b493-4a47-9428-c9acfda8cbc2\") " pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:25 crc kubenswrapper[4812]: I0131 04:40:25.105395 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nhs27\" (UniqueName: \"kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27\") pod \"mariadb-operator-index-h7wjp\" (UID: \"1eff129a-b493-4a47-9428-c9acfda8cbc2\") " pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:25 crc kubenswrapper[4812]: I0131 04:40:25.127590 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhs27\" (UniqueName: \"kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27\") pod \"mariadb-operator-index-h7wjp\" (UID: \"1eff129a-b493-4a47-9428-c9acfda8cbc2\") " pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:25 crc kubenswrapper[4812]: I0131 04:40:25.257777 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:25 crc kubenswrapper[4812]: I0131 04:40:25.524151 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:25 crc kubenswrapper[4812]: W0131 04:40:25.532106 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1eff129a_b493_4a47_9428_c9acfda8cbc2.slice/crio-0fb3c9a526945a4f6fb9248a9f8efa23db6b685e06aed0749838134bbd8038ea WatchSource:0}: Error finding container 0fb3c9a526945a4f6fb9248a9f8efa23db6b685e06aed0749838134bbd8038ea: Status 404 returned error can't find the container with id 0fb3c9a526945a4f6fb9248a9f8efa23db6b685e06aed0749838134bbd8038ea Jan 31 04:40:26 crc kubenswrapper[4812]: I0131 04:40:26.244372 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h7wjp" event={"ID":"1eff129a-b493-4a47-9428-c9acfda8cbc2","Type":"ContainerStarted","Data":"0fb3c9a526945a4f6fb9248a9f8efa23db6b685e06aed0749838134bbd8038ea"} Jan 31 04:40:27 crc kubenswrapper[4812]: I0131 04:40:27.250136 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h7wjp" event={"ID":"1eff129a-b493-4a47-9428-c9acfda8cbc2","Type":"ContainerStarted","Data":"1e6c7d15185d509e8338b5dae03a2fd441cdb49586da4ef8b3ed49ea480eca08"} Jan 31 04:40:27 crc kubenswrapper[4812]: I0131 04:40:27.265272 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-h7wjp" podStartSLOduration=2.481313533 podStartE2EDuration="3.26525406s" podCreationTimestamp="2026-01-31 04:40:24 +0000 UTC" firstStartedPulling="2026-01-31 04:40:25.533563751 +0000 UTC m=+834.028585416" lastFinishedPulling="2026-01-31 04:40:26.317504248 +0000 UTC m=+834.812525943" observedRunningTime="2026-01-31 04:40:27.262116175 +0000 UTC m=+835.757137850" 
watchObservedRunningTime="2026-01-31 04:40:27.26525406 +0000 UTC m=+835.760275725" Jan 31 04:40:28 crc kubenswrapper[4812]: I0131 04:40:28.303894 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:28 crc kubenswrapper[4812]: I0131 04:40:28.498955 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-webhook-server-7df86c4f6c-kctff" Jan 31 04:40:28 crc kubenswrapper[4812]: I0131 04:40:28.915611 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:40:28 crc kubenswrapper[4812]: I0131 04:40:28.920482 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:28 crc kubenswrapper[4812]: I0131 04:40:28.937769 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.061262 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l4sjd\" (UniqueName: \"kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd\") pod \"mariadb-operator-index-788ws\" (UID: \"65cc13cf-2366-4670-ac64-27e8ffa38afc\") " pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.163416 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-l4sjd\" (UniqueName: \"kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd\") pod \"mariadb-operator-index-788ws\" (UID: \"65cc13cf-2366-4670-ac64-27e8ffa38afc\") " pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.191035 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-l4sjd\" (UniqueName: \"kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd\") pod \"mariadb-operator-index-788ws\" (UID: \"65cc13cf-2366-4670-ac64-27e8ffa38afc\") " pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.266389 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-h7wjp" podUID="1eff129a-b493-4a47-9428-c9acfda8cbc2" containerName="registry-server" containerID="cri-o://1e6c7d15185d509e8338b5dae03a2fd441cdb49586da4ef8b3ed49ea480eca08" gracePeriod=2 Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.282228 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:29 crc kubenswrapper[4812]: I0131 04:40:29.718481 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:40:30 crc kubenswrapper[4812]: I0131 04:40:30.275611 4812 generic.go:334] "Generic (PLEG): container finished" podID="1eff129a-b493-4a47-9428-c9acfda8cbc2" containerID="1e6c7d15185d509e8338b5dae03a2fd441cdb49586da4ef8b3ed49ea480eca08" exitCode=0 Jan 31 04:40:30 crc kubenswrapper[4812]: I0131 04:40:30.275732 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h7wjp" event={"ID":"1eff129a-b493-4a47-9428-c9acfda8cbc2","Type":"ContainerDied","Data":"1e6c7d15185d509e8338b5dae03a2fd441cdb49586da4ef8b3ed49ea480eca08"} Jan 31 04:40:30 crc kubenswrapper[4812]: I0131 04:40:30.277248 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-788ws" event={"ID":"65cc13cf-2366-4670-ac64-27e8ffa38afc","Type":"ContainerStarted","Data":"cbda4954c4e0132714bc4c7decdd804416e8cf10437cfa25e87402b6d9237f2a"} Jan 31 04:40:31 crc kubenswrapper[4812]: I0131 04:40:31.709240 4812 util.go:48] "No 
ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:31 crc kubenswrapper[4812]: I0131 04:40:31.805100 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhs27\" (UniqueName: \"kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27\") pod \"1eff129a-b493-4a47-9428-c9acfda8cbc2\" (UID: \"1eff129a-b493-4a47-9428-c9acfda8cbc2\") " Jan 31 04:40:31 crc kubenswrapper[4812]: I0131 04:40:31.814091 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27" (OuterVolumeSpecName: "kube-api-access-nhs27") pod "1eff129a-b493-4a47-9428-c9acfda8cbc2" (UID: "1eff129a-b493-4a47-9428-c9acfda8cbc2"). InnerVolumeSpecName "kube-api-access-nhs27". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:40:31 crc kubenswrapper[4812]: I0131 04:40:31.906062 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nhs27\" (UniqueName: \"kubernetes.io/projected/1eff129a-b493-4a47-9428-c9acfda8cbc2-kube-api-access-nhs27\") on node \"crc\" DevicePath \"\"" Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.291180 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-h7wjp" event={"ID":"1eff129a-b493-4a47-9428-c9acfda8cbc2","Type":"ContainerDied","Data":"0fb3c9a526945a4f6fb9248a9f8efa23db6b685e06aed0749838134bbd8038ea"} Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.291208 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-h7wjp" Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.291266 4812 scope.go:117] "RemoveContainer" containerID="1e6c7d15185d509e8338b5dae03a2fd441cdb49586da4ef8b3ed49ea480eca08" Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.292738 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-788ws" event={"ID":"65cc13cf-2366-4670-ac64-27e8ffa38afc","Type":"ContainerStarted","Data":"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d"} Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.314823 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-index-788ws" podStartSLOduration=2.508715832 podStartE2EDuration="4.314802618s" podCreationTimestamp="2026-01-31 04:40:28 +0000 UTC" firstStartedPulling="2026-01-31 04:40:29.730931434 +0000 UTC m=+838.225953099" lastFinishedPulling="2026-01-31 04:40:31.53701817 +0000 UTC m=+840.032039885" observedRunningTime="2026-01-31 04:40:32.309034891 +0000 UTC m=+840.804056556" watchObservedRunningTime="2026-01-31 04:40:32.314802618 +0000 UTC m=+840.809824283" Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.334687 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:32 crc kubenswrapper[4812]: I0131 04:40:32.347791 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-h7wjp"] Jan 31 04:40:34 crc kubenswrapper[4812]: I0131 04:40:34.349559 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1eff129a-b493-4a47-9428-c9acfda8cbc2" path="/var/lib/kubelet/pods/1eff129a-b493-4a47-9428-c9acfda8cbc2/volumes" Jan 31 04:40:37 crc kubenswrapper[4812]: I0131 04:40:37.877829 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="metallb-system/frr-k8s-7qjt7" Jan 31 
04:40:39 crc kubenswrapper[4812]: I0131 04:40:39.283092 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:39 crc kubenswrapper[4812]: I0131 04:40:39.283571 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:39 crc kubenswrapper[4812]: I0131 04:40:39.313143 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:39 crc kubenswrapper[4812]: I0131 04:40:39.373156 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.108823 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t"] Jan 31 04:40:46 crc kubenswrapper[4812]: E0131 04:40:46.109970 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1eff129a-b493-4a47-9428-c9acfda8cbc2" containerName="registry-server" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.109997 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1eff129a-b493-4a47-9428-c9acfda8cbc2" containerName="registry-server" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.110202 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1eff129a-b493-4a47-9428-c9acfda8cbc2" containerName="registry-server" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.111565 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.115317 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.127889 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t"] Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.226739 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.226820 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk2cf\" (UniqueName: \"kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.226911 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 
04:40:46.328718 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.328875 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rk2cf\" (UniqueName: \"kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.328979 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.329529 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.329659 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.359395 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk2cf\" (UniqueName: \"kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf\") pod \"f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.498051 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:46 crc kubenswrapper[4812]: I0131 04:40:46.765299 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t"] Jan 31 04:40:47 crc kubenswrapper[4812]: I0131 04:40:47.436538 4812 generic.go:334] "Generic (PLEG): container finished" podID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerID="3b03ecdb0a91647103c8746672aec791c70952ee73965531c1ff88b92f63261b" exitCode=0 Jan 31 04:40:47 crc kubenswrapper[4812]: I0131 04:40:47.436589 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" event={"ID":"118ced32-01e9-43b6-b6d4-8588fd59d6e7","Type":"ContainerDied","Data":"3b03ecdb0a91647103c8746672aec791c70952ee73965531c1ff88b92f63261b"} Jan 31 04:40:47 crc kubenswrapper[4812]: I0131 04:40:47.436809 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" event={"ID":"118ced32-01e9-43b6-b6d4-8588fd59d6e7","Type":"ContainerStarted","Data":"c838ee2c71277470f017de70338d7cd37cd3da96037d9414ff3825af8933f2c1"} Jan 31 04:40:49 crc kubenswrapper[4812]: I0131 04:40:49.456371 4812 generic.go:334] "Generic (PLEG): container finished" podID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerID="26e8194d212ca6552a12c0a8ebb634d9aea1baa5d3b43df5ea1bddef68cbde4a" exitCode=0 Jan 31 04:40:49 crc kubenswrapper[4812]: I0131 04:40:49.456900 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" event={"ID":"118ced32-01e9-43b6-b6d4-8588fd59d6e7","Type":"ContainerDied","Data":"26e8194d212ca6552a12c0a8ebb634d9aea1baa5d3b43df5ea1bddef68cbde4a"} Jan 31 04:40:50 crc kubenswrapper[4812]: I0131 04:40:50.465032 4812 generic.go:334] "Generic (PLEG): container finished" podID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerID="67c8fa1c0270baf5a6ebce83741ac7c6978656eb2b375db434b740a6e59fde8b" exitCode=0 Jan 31 04:40:50 crc kubenswrapper[4812]: I0131 04:40:50.465091 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" event={"ID":"118ced32-01e9-43b6-b6d4-8588fd59d6e7","Type":"ContainerDied","Data":"67c8fa1c0270baf5a6ebce83741ac7c6978656eb2b375db434b740a6e59fde8b"} Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.823373 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.910111 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rk2cf\" (UniqueName: \"kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf\") pod \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.910615 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util\") pod \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.910652 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle\") pod \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\" (UID: \"118ced32-01e9-43b6-b6d4-8588fd59d6e7\") " Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.912550 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle" (OuterVolumeSpecName: "bundle") pod "118ced32-01e9-43b6-b6d4-8588fd59d6e7" (UID: "118ced32-01e9-43b6-b6d4-8588fd59d6e7"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.918232 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf" (OuterVolumeSpecName: "kube-api-access-rk2cf") pod "118ced32-01e9-43b6-b6d4-8588fd59d6e7" (UID: "118ced32-01e9-43b6-b6d4-8588fd59d6e7"). InnerVolumeSpecName "kube-api-access-rk2cf". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:40:51 crc kubenswrapper[4812]: I0131 04:40:51.941027 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util" (OuterVolumeSpecName: "util") pod "118ced32-01e9-43b6-b6d4-8588fd59d6e7" (UID: "118ced32-01e9-43b6-b6d4-8588fd59d6e7"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.012309 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rk2cf\" (UniqueName: \"kubernetes.io/projected/118ced32-01e9-43b6-b6d4-8588fd59d6e7-kube-api-access-rk2cf\") on node \"crc\" DevicePath \"\"" Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.012358 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.012378 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/118ced32-01e9-43b6-b6d4-8588fd59d6e7-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.500609 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" event={"ID":"118ced32-01e9-43b6-b6d4-8588fd59d6e7","Type":"ContainerDied","Data":"c838ee2c71277470f017de70338d7cd37cd3da96037d9414ff3825af8933f2c1"} Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.500684 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t" Jan 31 04:40:52 crc kubenswrapper[4812]: I0131 04:40:52.500692 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c838ee2c71277470f017de70338d7cd37cd3da96037d9414ff3825af8933f2c1" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.128624 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:40:59 crc kubenswrapper[4812]: E0131 04:40:59.129612 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="util" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.129634 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="util" Jan 31 04:40:59 crc kubenswrapper[4812]: E0131 04:40:59.129660 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="pull" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.129671 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="pull" Jan 31 04:40:59 crc kubenswrapper[4812]: E0131 04:40:59.129694 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="extract" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.129705 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="extract" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.129875 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" containerName="extract" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.130441 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.134383 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-service-cert" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.135872 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"mariadb-operator-controller-manager-dockercfg-hw48z" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.136128 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"webhook-server-cert" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.141870 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.221206 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7s9jw\" (UniqueName: \"kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.221308 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.221333 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.322087 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.322127 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.322158 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7s9jw\" (UniqueName: \"kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.331631 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: 
\"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.338963 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.339464 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7s9jw\" (UniqueName: \"kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw\") pod \"mariadb-operator-controller-manager-75bc68fcbf-78zf4\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.453751 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:40:59 crc kubenswrapper[4812]: I0131 04:40:59.844237 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:41:00 crc kubenswrapper[4812]: I0131 04:41:00.572909 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" event={"ID":"c2d7a051-0941-42a6-82dc-76cfa73c185d","Type":"ContainerStarted","Data":"0218d70c63fa48fdfa752970a4cbdc30d48b28f6d1bebbfdec8db8e3db950e40"} Jan 31 04:41:03 crc kubenswrapper[4812]: I0131 04:41:03.598222 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" event={"ID":"c2d7a051-0941-42a6-82dc-76cfa73c185d","Type":"ContainerStarted","Data":"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4"} Jan 31 04:41:03 crc kubenswrapper[4812]: I0131 04:41:03.598652 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:41:03 crc kubenswrapper[4812]: I0131 04:41:03.623788 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" podStartSLOduration=1.633371962 podStartE2EDuration="4.623763139s" podCreationTimestamp="2026-01-31 04:40:59 +0000 UTC" firstStartedPulling="2026-01-31 04:40:59.860366441 +0000 UTC m=+868.355388106" lastFinishedPulling="2026-01-31 04:41:02.850757618 +0000 UTC m=+871.345779283" observedRunningTime="2026-01-31 04:41:03.617411798 +0000 UTC m=+872.112433483" watchObservedRunningTime="2026-01-31 04:41:03.623763139 +0000 UTC m=+872.118784824" Jan 31 04:41:09 crc kubenswrapper[4812]: I0131 04:41:09.458135 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.570023 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.571175 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.573279 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-index-dockercfg-x5ljk" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.581672 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.704332 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qgddp\" (UniqueName: \"kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp\") pod \"infra-operator-index-f6vmn\" (UID: \"0771cbbe-eeee-435e-8740-edab15c2484c\") " pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.805908 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qgddp\" (UniqueName: \"kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp\") pod \"infra-operator-index-f6vmn\" (UID: \"0771cbbe-eeee-435e-8740-edab15c2484c\") " pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.824974 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qgddp\" (UniqueName: \"kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp\") pod \"infra-operator-index-f6vmn\" (UID: 
\"0771cbbe-eeee-435e-8740-edab15c2484c\") " pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:12 crc kubenswrapper[4812]: I0131 04:41:12.892283 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:13 crc kubenswrapper[4812]: I0131 04:41:13.359908 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:41:13 crc kubenswrapper[4812]: W0131 04:41:13.368111 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0771cbbe_eeee_435e_8740_edab15c2484c.slice/crio-b505813ddc93efbe6cf41064deef7bb997932980b152cf0847e816e89732c279 WatchSource:0}: Error finding container b505813ddc93efbe6cf41064deef7bb997932980b152cf0847e816e89732c279: Status 404 returned error can't find the container with id b505813ddc93efbe6cf41064deef7bb997932980b152cf0847e816e89732c279 Jan 31 04:41:13 crc kubenswrapper[4812]: I0131 04:41:13.676557 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-f6vmn" event={"ID":"0771cbbe-eeee-435e-8740-edab15c2484c","Type":"ContainerStarted","Data":"b505813ddc93efbe6cf41064deef7bb997932980b152cf0847e816e89732c279"} Jan 31 04:41:14 crc kubenswrapper[4812]: I0131 04:41:14.684500 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-f6vmn" event={"ID":"0771cbbe-eeee-435e-8740-edab15c2484c","Type":"ContainerStarted","Data":"cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005"} Jan 31 04:41:14 crc kubenswrapper[4812]: I0131 04:41:14.702237 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-index-f6vmn" podStartSLOduration=2.106980001 podStartE2EDuration="2.70221567s" podCreationTimestamp="2026-01-31 04:41:12 +0000 UTC" firstStartedPulling="2026-01-31 
04:41:13.371312103 +0000 UTC m=+881.866333798" lastFinishedPulling="2026-01-31 04:41:13.966547802 +0000 UTC m=+882.461569467" observedRunningTime="2026-01-31 04:41:14.699635011 +0000 UTC m=+883.194656686" watchObservedRunningTime="2026-01-31 04:41:14.70221567 +0000 UTC m=+883.197237345" Jan 31 04:41:22 crc kubenswrapper[4812]: I0131 04:41:22.893148 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:22 crc kubenswrapper[4812]: I0131 04:41:22.893905 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:22 crc kubenswrapper[4812]: I0131 04:41:22.935207 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:23 crc kubenswrapper[4812]: I0131 04:41:23.797211 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.410931 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977"] Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.412381 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.416018 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.424378 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.424590 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9n4pg\" (UniqueName: \"kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.424686 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.428792 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977"] Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 
04:41:25.525738 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9n4pg\" (UniqueName: \"kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.526145 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.526243 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.526961 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.527168 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.550082 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9n4pg\" (UniqueName: \"kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg\") pod \"d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:25 crc kubenswrapper[4812]: I0131 04:41:25.751935 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:26 crc kubenswrapper[4812]: I0131 04:41:26.239147 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977"] Jan 31 04:41:26 crc kubenswrapper[4812]: I0131 04:41:26.783989 4812 generic.go:334] "Generic (PLEG): container finished" podID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerID="274ccd42723a6b9e53535318ebc8580110e93c4297bd453f71f72ba737c3345b" exitCode=0 Jan 31 04:41:26 crc kubenswrapper[4812]: I0131 04:41:26.784029 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" event={"ID":"7b0f44c0-ce3b-44a9-b345-4143577af2f2","Type":"ContainerDied","Data":"274ccd42723a6b9e53535318ebc8580110e93c4297bd453f71f72ba737c3345b"} Jan 31 04:41:26 crc kubenswrapper[4812]: I0131 04:41:26.784051 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" event={"ID":"7b0f44c0-ce3b-44a9-b345-4143577af2f2","Type":"ContainerStarted","Data":"ac53f169762335f29f7843c9903f688d082a0ab78cb5719f170b649c8b2b041d"} Jan 31 04:41:27 crc kubenswrapper[4812]: I0131 04:41:27.795807 4812 generic.go:334] "Generic (PLEG): container finished" podID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerID="3f710dfb38b60bf2b2117b69bfc90b0c7546143908158cf1565f087060a0d5f4" exitCode=0 Jan 31 04:41:27 crc kubenswrapper[4812]: I0131 04:41:27.795953 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" event={"ID":"7b0f44c0-ce3b-44a9-b345-4143577af2f2","Type":"ContainerDied","Data":"3f710dfb38b60bf2b2117b69bfc90b0c7546143908158cf1565f087060a0d5f4"} Jan 31 04:41:28 crc kubenswrapper[4812]: I0131 04:41:28.808351 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" event={"ID":"7b0f44c0-ce3b-44a9-b345-4143577af2f2","Type":"ContainerDied","Data":"38dc490856d8d7a2a51dc77ce5da33b29275b1d6d807677c6f1424d59ff987aa"} Jan 31 04:41:28 crc kubenswrapper[4812]: I0131 04:41:28.808181 4812 generic.go:334] "Generic (PLEG): container finished" podID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerID="38dc490856d8d7a2a51dc77ce5da33b29275b1d6d807677c6f1424d59ff987aa" exitCode=0 Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.107594 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.296784 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle\") pod \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.296872 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util\") pod \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.297134 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9n4pg\" (UniqueName: \"kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg\") pod \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\" (UID: \"7b0f44c0-ce3b-44a9-b345-4143577af2f2\") " Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.301722 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle" (OuterVolumeSpecName: "bundle") pod "7b0f44c0-ce3b-44a9-b345-4143577af2f2" (UID: "7b0f44c0-ce3b-44a9-b345-4143577af2f2"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.305027 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg" (OuterVolumeSpecName: "kube-api-access-9n4pg") pod "7b0f44c0-ce3b-44a9-b345-4143577af2f2" (UID: "7b0f44c0-ce3b-44a9-b345-4143577af2f2"). InnerVolumeSpecName "kube-api-access-9n4pg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.334646 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util" (OuterVolumeSpecName: "util") pod "7b0f44c0-ce3b-44a9-b345-4143577af2f2" (UID: "7b0f44c0-ce3b-44a9-b345-4143577af2f2"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.398759 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9n4pg\" (UniqueName: \"kubernetes.io/projected/7b0f44c0-ce3b-44a9-b345-4143577af2f2-kube-api-access-9n4pg\") on node \"crc\" DevicePath \"\"" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.398805 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.398824 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/7b0f44c0-ce3b-44a9-b345-4143577af2f2-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.825878 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" event={"ID":"7b0f44c0-ce3b-44a9-b345-4143577af2f2","Type":"ContainerDied","Data":"ac53f169762335f29f7843c9903f688d082a0ab78cb5719f170b649c8b2b041d"} Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.825933 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ac53f169762335f29f7843c9903f688d082a0ab78cb5719f170b649c8b2b041d" Jan 31 04:41:30 crc kubenswrapper[4812]: I0131 04:41:30.826020 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.547785 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:41:42 crc kubenswrapper[4812]: E0131 04:41:42.548540 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="extract" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.548553 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="extract" Jan 31 04:41:42 crc kubenswrapper[4812]: E0131 04:41:42.548563 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="util" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.548570 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="util" Jan 31 04:41:42 crc kubenswrapper[4812]: E0131 04:41:42.548578 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="pull" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.548584 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="pull" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.548678 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" containerName="extract" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.549069 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.550919 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-service-cert" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.553123 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"infra-operator-controller-manager-dockercfg-t969r" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.562329 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.666876 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dlh54\" (UniqueName: \"kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.666931 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.666979 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: 
\"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.768141 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dlh54\" (UniqueName: \"kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.768189 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.768227 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.774644 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.774691 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.796726 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dlh54\" (UniqueName: \"kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54\") pod \"infra-operator-controller-manager-748684b5b6-cr6xv\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:42 crc kubenswrapper[4812]: I0131 04:41:42.863818 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:43 crc kubenswrapper[4812]: I0131 04:41:43.161076 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:41:43 crc kubenswrapper[4812]: I0131 04:41:43.922447 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" event={"ID":"d72ee50b-10a1-4e23-b0eb-ec5227c4f740","Type":"ContainerStarted","Data":"712ebe9f5aab441d766abb60305028907417676d719d86b026bc227c9c13f1a0"} Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.787948 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.788732 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: W0131 04:41:44.790571 4812 reflector.go:561] object-"glance-kuttl-tests"/"openstack-config-data": failed to list *v1.ConfigMap: configmaps "openstack-config-data" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Jan 31 04:41:44 crc kubenswrapper[4812]: E0131 04:41:44.790614 4812 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"openstack-config-data\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-config-data\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:41:44 crc kubenswrapper[4812]: W0131 04:41:44.790937 4812 reflector.go:561] object-"glance-kuttl-tests"/"kube-root-ca.crt": failed to list *v1.ConfigMap: configmaps "kube-root-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Jan 31 04:41:44 crc kubenswrapper[4812]: W0131 04:41:44.790957 4812 reflector.go:561] object-"glance-kuttl-tests"/"openshift-service-ca.crt": failed to list *v1.ConfigMap: configmaps "openshift-service-ca.crt" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Jan 31 04:41:44 crc kubenswrapper[4812]: E0131 04:41:44.790973 4812 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"kube-root-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:crc\" 
cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:41:44 crc kubenswrapper[4812]: E0131 04:41:44.790976 4812 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"openshift-service-ca.crt\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:41:44 crc kubenswrapper[4812]: W0131 04:41:44.790946 4812 reflector.go:561] object-"glance-kuttl-tests"/"openstack-scripts": failed to list *v1.ConfigMap: configmaps "openstack-scripts" is forbidden: User "system:node:crc" cannot list resource "configmaps" in API group "" in the namespace "glance-kuttl-tests": no relationship found between node 'crc' and this object Jan 31 04:41:44 crc kubenswrapper[4812]: E0131 04:41:44.791005 4812 reflector.go:158] "Unhandled Error" err="object-\"glance-kuttl-tests\"/\"openstack-scripts\": Failed to watch *v1.ConfigMap: failed to list *v1.ConfigMap: configmaps \"openstack-scripts\" is forbidden: User \"system:node:crc\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"glance-kuttl-tests\": no relationship found between node 'crc' and this object" logger="UnhandledError" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.794367 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"galera-openstack-dockercfg-gtjhm" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.798575 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.799473 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.814917 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.816641 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.825425 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.836568 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.860752 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901237 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xh94m\" (UniqueName: \"kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901276 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901296 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k6hlq\" (UniqueName: 
\"kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901327 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901354 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901370 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901389 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901413 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901432 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901457 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901484 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901509 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901527 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901542 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901558 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901574 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwrbs\" (UniqueName: \"kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901589 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:44 crc kubenswrapper[4812]: I0131 04:41:44.901611 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data-generated\" (UniqueName: 
\"kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002633 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002684 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002727 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002746 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002764 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: 
\"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002799 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dwrbs\" (UniqueName: \"kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002818 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002860 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002881 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xh94m\" (UniqueName: \"kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002898 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " 
pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002915 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k6hlq\" (UniqueName: \"kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002937 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002959 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.002974 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003016 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc 
kubenswrapper[4812]: I0131 04:41:45.003037 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003056 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003073 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003162 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") device mount path \"/mnt/openstack/pv16\"" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003162 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") device mount path \"/mnt/openstack/pv11\"" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003430 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003482 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.003541 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") device mount path \"/mnt/openstack/pv17\"" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.005196 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.021520 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.021648 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage17-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage17-crc\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.021652 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.767512 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"kube-root-ca.crt" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.932756 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config-data" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.933751 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.933885 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.933751 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 
31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.934238 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.934487 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.934598 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.937980 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" event={"ID":"d72ee50b-10a1-4e23-b0eb-ec5227c4f740","Type":"ContainerStarted","Data":"1aa87d438681fa7652fb4829d433806c2725d1cc2cf3fc164d33d5065858be90"} Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.938184 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:45 crc kubenswrapper[4812]: I0131 04:41:45.964528 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" podStartSLOduration=1.991238991 podStartE2EDuration="3.964510734s" podCreationTimestamp="2026-01-31 04:41:42 +0000 UTC" 
firstStartedPulling="2026-01-31 04:41:43.168716861 +0000 UTC m=+911.663738526" lastFinishedPulling="2026-01-31 04:41:45.141988604 +0000 UTC m=+913.637010269" observedRunningTime="2026-01-31 04:41:45.962794879 +0000 UTC m=+914.457816644" watchObservedRunningTime="2026-01-31 04:41:45.964510734 +0000 UTC m=+914.459532399" Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.004355 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.005153 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts podName:efd185c8-37f1-4661-b631-524671bff15f nodeName:}" failed. No retries permitted until 2026-01-31 04:41:46.505128105 +0000 UTC m=+915.000149770 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts") pod "openstack-galera-1" (UID: "efd185c8-37f1-4661-b631-524671bff15f") : failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.004413 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.005329 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts podName:7666fda0-373e-4936-bd6f-ea26691ad9d5 nodeName:}" failed. No retries permitted until 2026-01-31 04:41:46.50531791 +0000 UTC m=+915.000339575 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts") pod "openstack-galera-0" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5") : failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.004437 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: E0131 04:41:46.005484 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts podName:546d2f27-dfde-4446-978b-19b2e6a1d6a0 nodeName:}" failed. No retries permitted until 2026-01-31 04:41:46.505474064 +0000 UTC m=+915.000495729 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts") pod "openstack-galera-2" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0") : failed to sync configmap cache: timed out waiting for the condition Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.133065 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openshift-service-ca.crt" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.141670 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k6hlq\" (UniqueName: \"kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.141757 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xh94m\" (UniqueName: 
\"kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.143185 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwrbs\" (UniqueName: \"kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.195166 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.532751 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.532804 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.532911 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.534438 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") pod \"openstack-galera-1\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.535323 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") pod \"openstack-galera-0\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.536594 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") pod \"openstack-galera-2\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.610264 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.627534 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.647233 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.866057 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.903334 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.944407 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerStarted","Data":"a7d02b1f60921b56dfabadefeec3379818b71014bfb53d056d0a41e39426ef72"} Jan 31 04:41:46 crc kubenswrapper[4812]: I0131 04:41:46.945390 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerStarted","Data":"4b50cdbdf9eea25def0c3b9e4c5dd09260e5bcf1a43a4b1045c1b3eac4ca831d"} Jan 31 04:41:47 crc kubenswrapper[4812]: I0131 04:41:47.154376 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:41:47 crc kubenswrapper[4812]: W0131 04:41:47.158219 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podefd185c8_37f1_4661_b631_524671bff15f.slice/crio-b56c1621e0a70d11272fb594cc08f91e48edacb4dafdf02da4f9adeefe297503 WatchSource:0}: Error finding container b56c1621e0a70d11272fb594cc08f91e48edacb4dafdf02da4f9adeefe297503: Status 404 returned error can't find the container with id b56c1621e0a70d11272fb594cc08f91e48edacb4dafdf02da4f9adeefe297503 Jan 31 04:41:47 crc kubenswrapper[4812]: I0131 04:41:47.952515 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" 
event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerStarted","Data":"b56c1621e0a70d11272fb594cc08f91e48edacb4dafdf02da4f9adeefe297503"} Jan 31 04:41:52 crc kubenswrapper[4812]: I0131 04:41:52.870890 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.003455 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerStarted","Data":"38e52c9c0288c0b63675bbc84f846a5149a62c293742776eaf3db9f19a358d2e"} Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.004594 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerStarted","Data":"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de"} Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.006465 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerStarted","Data":"4bbd1c19288ea0a64cbcbbe90be9a0d57e6900bbfeba657dc1d439e2fccb2b8d"} Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.922257 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.923208 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.924568 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"memcached-memcached-dockercfg-77rgb" Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.927981 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"memcached-config-data" Jan 31 04:41:55 crc kubenswrapper[4812]: I0131 04:41:55.932825 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.007961 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xt4q2\" (UniqueName: \"kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.008029 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.008130 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.111556 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config\") pod 
\"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.111709 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.112022 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.112625 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.113273 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xt4q2\" (UniqueName: \"kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.138641 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xt4q2\" (UniqueName: \"kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2\") pod \"memcached-0\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.244530 4812 util.go:30] "No 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 04:41:56 crc kubenswrapper[4812]: I0131 04:41:56.704391 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.017280 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"014dcd69-d412-4c77-8a96-521ffc036f50","Type":"ContainerStarted","Data":"f74de4f57246c2edca36958d666a4a06cc8d5e40c67c651d1ed098a1061a4eb0"} Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.368419 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.370340 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.372934 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-index-dockercfg-8gqpc" Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.382197 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.534512 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bgkcw\" (UniqueName: \"kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw\") pod \"rabbitmq-cluster-operator-index-tg59z\" (UID: \"d74b15ea-c24b-479e-9469-ee16f0cc85f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.636960 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bgkcw\" (UniqueName: 
\"kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw\") pod \"rabbitmq-cluster-operator-index-tg59z\" (UID: \"d74b15ea-c24b-479e-9469-ee16f0cc85f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.653123 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bgkcw\" (UniqueName: \"kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw\") pod \"rabbitmq-cluster-operator-index-tg59z\" (UID: \"d74b15ea-c24b-479e-9469-ee16f0cc85f0\") " pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:41:57 crc kubenswrapper[4812]: I0131 04:41:57.746584 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.026555 4812 generic.go:334] "Generic (PLEG): container finished" podID="efd185c8-37f1-4661-b631-524671bff15f" containerID="4bbd1c19288ea0a64cbcbbe90be9a0d57e6900bbfeba657dc1d439e2fccb2b8d" exitCode=0 Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.028124 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerDied","Data":"4bbd1c19288ea0a64cbcbbe90be9a0d57e6900bbfeba657dc1d439e2fccb2b8d"} Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.041702 4812 generic.go:334] "Generic (PLEG): container finished" podID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerID="38e52c9c0288c0b63675bbc84f846a5149a62c293742776eaf3db9f19a358d2e" exitCode=0 Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.041969 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" 
event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerDied","Data":"38e52c9c0288c0b63675bbc84f846a5149a62c293742776eaf3db9f19a358d2e"} Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.049127 4812 generic.go:334] "Generic (PLEG): container finished" podID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerID="719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de" exitCode=0 Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.049195 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerDied","Data":"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de"} Jan 31 04:41:58 crc kubenswrapper[4812]: I0131 04:41:58.169888 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.064343 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" event={"ID":"d74b15ea-c24b-479e-9469-ee16f0cc85f0","Type":"ContainerStarted","Data":"0751313fe43ac721041bc470e16461bf007216cc0354b53545fae3718f89cbfd"} Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.066708 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerStarted","Data":"5b262d6b7da31fc5a2b2f4365848ada534c8499fe6dece82db9aa3201448502c"} Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.068885 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerStarted","Data":"b698949a0a69c7dc614f1c6150a7e3959a0c7af53a41006aaa9bd24009590a46"} Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.070528 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerStarted","Data":"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae"} Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.072038 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"014dcd69-d412-4c77-8a96-521ffc036f50","Type":"ContainerStarted","Data":"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce"} Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.072167 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/memcached-0" Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.094790 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-1" podStartSLOduration=10.234549947 podStartE2EDuration="17.09477494s" podCreationTimestamp="2026-01-31 04:41:43 +0000 UTC" firstStartedPulling="2026-01-31 04:41:47.162169495 +0000 UTC m=+915.657191160" lastFinishedPulling="2026-01-31 04:41:54.022394448 +0000 UTC m=+922.517416153" observedRunningTime="2026-01-31 04:42:00.09065848 +0000 UTC m=+928.585680155" watchObservedRunningTime="2026-01-31 04:42:00.09477494 +0000 UTC m=+928.589796605" Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.108667 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/memcached-0" podStartSLOduration=2.352045322 podStartE2EDuration="5.108649593s" podCreationTimestamp="2026-01-31 04:41:55 +0000 UTC" firstStartedPulling="2026-01-31 04:41:56.71200212 +0000 UTC m=+925.207023785" lastFinishedPulling="2026-01-31 04:41:59.468606391 +0000 UTC m=+927.963628056" observedRunningTime="2026-01-31 04:42:00.105445657 +0000 UTC m=+928.600467312" watchObservedRunningTime="2026-01-31 04:42:00.108649593 +0000 UTC m=+928.603671248" Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.132224 4812 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-0" podStartSLOduration=9.982350847 podStartE2EDuration="17.132203425s" podCreationTimestamp="2026-01-31 04:41:43 +0000 UTC" firstStartedPulling="2026-01-31 04:41:46.869641682 +0000 UTC m=+915.364663347" lastFinishedPulling="2026-01-31 04:41:54.01949427 +0000 UTC m=+922.514515925" observedRunningTime="2026-01-31 04:42:00.131360612 +0000 UTC m=+928.626382277" watchObservedRunningTime="2026-01-31 04:42:00.132203425 +0000 UTC m=+928.627225090" Jan 31 04:42:00 crc kubenswrapper[4812]: I0131 04:42:00.157231 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstack-galera-2" podStartSLOduration=9.997282129 podStartE2EDuration="17.157214287s" podCreationTimestamp="2026-01-31 04:41:43 +0000 UTC" firstStartedPulling="2026-01-31 04:41:46.909699038 +0000 UTC m=+915.404720713" lastFinishedPulling="2026-01-31 04:41:54.069631196 +0000 UTC m=+922.564652871" observedRunningTime="2026-01-31 04:42:00.156124227 +0000 UTC m=+928.651145892" watchObservedRunningTime="2026-01-31 04:42:00.157214287 +0000 UTC m=+928.652235952" Jan 31 04:42:02 crc kubenswrapper[4812]: E0131 04:42:02.098826 4812 upgradeaware.go:441] Error proxying data from backend to client: writeto tcp 38.102.83.238:33628->38.102.83.238:34831: read tcp 38.102.83.238:33628->38.102.83.238:34831: read: connection reset by peer Jan 31 04:42:03 crc kubenswrapper[4812]: I0131 04:42:03.113891 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" event={"ID":"d74b15ea-c24b-479e-9469-ee16f0cc85f0","Type":"ContainerStarted","Data":"1142f1825e9bea38609ccde8f470500672c0dc3ca2026322ae9e5a66d9fe4fdd"} Jan 31 04:42:03 crc kubenswrapper[4812]: I0131 04:42:03.130212 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" 
podStartSLOduration=2.979600559 podStartE2EDuration="6.130193526s" podCreationTimestamp="2026-01-31 04:41:57 +0000 UTC" firstStartedPulling="2026-01-31 04:41:59.43204188 +0000 UTC m=+927.927063545" lastFinishedPulling="2026-01-31 04:42:02.582634847 +0000 UTC m=+931.077656512" observedRunningTime="2026-01-31 04:42:03.129172078 +0000 UTC m=+931.624193743" watchObservedRunningTime="2026-01-31 04:42:03.130193526 +0000 UTC m=+931.625215211" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.246081 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/memcached-0" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.610609 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.610694 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.628662 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.628766 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.647759 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:42:06 crc kubenswrapper[4812]: I0131 04:42:06.647851 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:42:07 crc kubenswrapper[4812]: I0131 04:42:07.747103 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:42:07 crc kubenswrapper[4812]: I0131 04:42:07.747395 4812 kubelet.go:2542] "SyncLoop (probe)" 
probe="readiness" status="" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:42:07 crc kubenswrapper[4812]: I0131 04:42:07.780151 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:42:08 crc kubenswrapper[4812]: I0131 04:42:08.176948 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.227068 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v"] Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.228530 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.231692 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.246986 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v"] Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.317622 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.317667 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6c9lc\" (UniqueName: 
\"kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.317707 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.418918 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6c9lc\" (UniqueName: \"kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.418978 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.419085 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util\") pod 
\"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.419575 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.420154 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.442661 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6c9lc\" (UniqueName: \"kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc\") pod \"9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.545147 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:09 crc kubenswrapper[4812]: I0131 04:42:09.861136 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v"] Jan 31 04:42:09 crc kubenswrapper[4812]: W0131 04:42:09.869133 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ee7c2e6_bdce_4bd4_80f9_7745b0072f56.slice/crio-ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e WatchSource:0}: Error finding container ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e: Status 404 returned error can't find the container with id ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e Jan 31 04:42:10 crc kubenswrapper[4812]: I0131 04:42:10.160299 4812 generic.go:334] "Generic (PLEG): container finished" podID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerID="f8a54cd567bfe9b9b5f8f1c625fb0f15433914ed7a8f1f19173dfa4b78eb08a2" exitCode=0 Jan 31 04:42:10 crc kubenswrapper[4812]: I0131 04:42:10.160671 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" event={"ID":"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56","Type":"ContainerDied","Data":"f8a54cd567bfe9b9b5f8f1c625fb0f15433914ed7a8f1f19173dfa4b78eb08a2"} Jan 31 04:42:10 crc kubenswrapper[4812]: I0131 04:42:10.160721 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" event={"ID":"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56","Type":"ContainerStarted","Data":"ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e"} Jan 31 04:42:10 crc kubenswrapper[4812]: I0131 04:42:10.802269 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:42:10 crc kubenswrapper[4812]: I0131 04:42:10.910345 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:42:11 crc kubenswrapper[4812]: I0131 04:42:11.169525 4812 generic.go:334] "Generic (PLEG): container finished" podID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerID="7b4f980d4fd4007dcb5809a9be93a83efcdbe26beb11208b5380add0faa92b6b" exitCode=0 Jan 31 04:42:11 crc kubenswrapper[4812]: I0131 04:42:11.169611 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" event={"ID":"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56","Type":"ContainerDied","Data":"7b4f980d4fd4007dcb5809a9be93a83efcdbe26beb11208b5380add0faa92b6b"} Jan 31 04:42:12 crc kubenswrapper[4812]: I0131 04:42:12.179591 4812 generic.go:334] "Generic (PLEG): container finished" podID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerID="2719d15c75272d57c130281742452eba890557a3cbbd31ff392413c151a763f6" exitCode=0 Jan 31 04:42:12 crc kubenswrapper[4812]: I0131 04:42:12.179678 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" event={"ID":"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56","Type":"ContainerDied","Data":"2719d15c75272d57c130281742452eba890557a3cbbd31ff392413c151a763f6"} Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.515867 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.680554 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6c9lc\" (UniqueName: \"kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc\") pod \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.680638 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util\") pod \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.680748 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle\") pod \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\" (UID: \"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56\") " Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.681517 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle" (OuterVolumeSpecName: "bundle") pod "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" (UID: "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.688511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc" (OuterVolumeSpecName: "kube-api-access-6c9lc") pod "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" (UID: "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56"). InnerVolumeSpecName "kube-api-access-6c9lc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.702565 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util" (OuterVolumeSpecName: "util") pod "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" (UID: "8ee7c2e6-bdce-4bd4-80f9-7745b0072f56"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.782542 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.782573 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6c9lc\" (UniqueName: \"kubernetes.io/projected/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-kube-api-access-6c9lc\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.782587 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.806910 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-ztrnl"] Jan 31 04:42:13 crc kubenswrapper[4812]: E0131 04:42:13.807186 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="extract" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.807203 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="extract" Jan 31 04:42:13 crc kubenswrapper[4812]: E0131 04:42:13.807228 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="util" 
Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.807236 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="util" Jan 31 04:42:13 crc kubenswrapper[4812]: E0131 04:42:13.807257 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="pull" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.807265 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="pull" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.807394 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" containerName="extract" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.808107 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.809879 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.820211 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ztrnl"] Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.883718 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jlkck\" (UniqueName: \"kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.883812 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.985524 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.985605 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jlkck\" (UniqueName: \"kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:13 crc kubenswrapper[4812]: I0131 04:42:13.986332 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.001830 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jlkck\" (UniqueName: \"kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck\") pod \"root-account-create-update-ztrnl\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.121551 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.195076 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" event={"ID":"8ee7c2e6-bdce-4bd4-80f9-7745b0072f56","Type":"ContainerDied","Data":"ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e"} Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.195130 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ea4ad66d8022e267fc0b2ba0a439d4750b9f260f40eea9336aa4acf0b71d2d8e" Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.195220 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v" Jan 31 04:42:14 crc kubenswrapper[4812]: I0131 04:42:14.554371 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ztrnl"] Jan 31 04:42:15 crc kubenswrapper[4812]: I0131 04:42:15.203446 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ztrnl" event={"ID":"718bb9c9-540b-4aae-ae69-fff60f342873","Type":"ContainerStarted","Data":"7b4ff425ca2c3c6eae96193094946d849383132389430ad3a9e79b7b0750cb5f"} Jan 31 04:42:16 crc kubenswrapper[4812]: I0131 04:42:16.210133 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ztrnl" event={"ID":"718bb9c9-540b-4aae-ae69-fff60f342873","Type":"ContainerStarted","Data":"badd670fdeba9d243fe153b69dda39206396e3b07a8eb8d911b7ea393b4fd59b"} Jan 31 04:42:16 crc kubenswrapper[4812]: I0131 04:42:16.227186 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/root-account-create-update-ztrnl" podStartSLOduration=3.227164763 podStartE2EDuration="3.227164763s" 
podCreationTimestamp="2026-01-31 04:42:13 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:42:16.225521549 +0000 UTC m=+944.720543234" watchObservedRunningTime="2026-01-31 04:42:16.227164763 +0000 UTC m=+944.722186438" Jan 31 04:42:16 crc kubenswrapper[4812]: I0131 04:42:16.769385 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/openstack-galera-2" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="galera" probeResult="failure" output=< Jan 31 04:42:16 crc kubenswrapper[4812]: wsrep_local_state_comment (Donor/Desynced) differs from Synced Jan 31 04:42:16 crc kubenswrapper[4812]: > Jan 31 04:42:18 crc kubenswrapper[4812]: I0131 04:42:18.228638 4812 generic.go:334] "Generic (PLEG): container finished" podID="718bb9c9-540b-4aae-ae69-fff60f342873" containerID="badd670fdeba9d243fe153b69dda39206396e3b07a8eb8d911b7ea393b4fd59b" exitCode=0 Jan 31 04:42:18 crc kubenswrapper[4812]: I0131 04:42:18.228709 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ztrnl" event={"ID":"718bb9c9-540b-4aae-ae69-fff60f342873","Type":"ContainerDied","Data":"badd670fdeba9d243fe153b69dda39206396e3b07a8eb8d911b7ea393b4fd59b"} Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.592508 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.658968 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jlkck\" (UniqueName: \"kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck\") pod \"718bb9c9-540b-4aae-ae69-fff60f342873\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.659091 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts\") pod \"718bb9c9-540b-4aae-ae69-fff60f342873\" (UID: \"718bb9c9-540b-4aae-ae69-fff60f342873\") " Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.660020 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "718bb9c9-540b-4aae-ae69-fff60f342873" (UID: "718bb9c9-540b-4aae-ae69-fff60f342873"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.666028 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck" (OuterVolumeSpecName: "kube-api-access-jlkck") pod "718bb9c9-540b-4aae-ae69-fff60f342873" (UID: "718bb9c9-540b-4aae-ae69-fff60f342873"). InnerVolumeSpecName "kube-api-access-jlkck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.760819 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jlkck\" (UniqueName: \"kubernetes.io/projected/718bb9c9-540b-4aae-ae69-fff60f342873-kube-api-access-jlkck\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:19 crc kubenswrapper[4812]: I0131 04:42:19.760863 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/718bb9c9-540b-4aae-ae69-fff60f342873-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.161184 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:42:20 crc kubenswrapper[4812]: E0131 04:42:20.161509 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="718bb9c9-540b-4aae-ae69-fff60f342873" containerName="mariadb-account-create-update" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.161533 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="718bb9c9-540b-4aae-ae69-fff60f342873" containerName="mariadb-account-create-update" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.161674 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="718bb9c9-540b-4aae-ae69-fff60f342873" containerName="mariadb-account-create-update" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.162257 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.165756 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"rabbitmq-cluster-operator-dockercfg-gcsgz" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.183791 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.240460 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/root-account-create-update-ztrnl" event={"ID":"718bb9c9-540b-4aae-ae69-fff60f342873","Type":"ContainerDied","Data":"7b4ff425ca2c3c6eae96193094946d849383132389430ad3a9e79b7b0750cb5f"} Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.240495 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="7b4ff425ca2c3c6eae96193094946d849383132389430ad3a9e79b7b0750cb5f" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.240543 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-ztrnl" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.267793 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gndzh\" (UniqueName: \"kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh\") pod \"rabbitmq-cluster-operator-779fc9694b-7jhbc\" (UID: \"5f08c58b-776d-4693-a282-e192ecc83bc2\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.369247 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gndzh\" (UniqueName: \"kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh\") pod \"rabbitmq-cluster-operator-779fc9694b-7jhbc\" (UID: \"5f08c58b-776d-4693-a282-e192ecc83bc2\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.391997 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gndzh\" (UniqueName: \"kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh\") pod \"rabbitmq-cluster-operator-779fc9694b-7jhbc\" (UID: \"5f08c58b-776d-4693-a282-e192ecc83bc2\") " pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:42:20 crc kubenswrapper[4812]: I0131 04:42:20.475403 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:42:20 crc kubenswrapper[4812]: E0131 04:42:20.672304 4812 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:60894->38.102.83.238:34831: write tcp 38.102.83.238:60894->38.102.83.238:34831: write: broken pipe Jan 31 04:42:20 crc kubenswrapper[4812]: E0131 04:42:20.724662 4812 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:60896->38.102.83.238:34831: write tcp 38.102.83.238:60896->38.102.83.238:34831: write: broken pipe Jan 31 04:42:20 crc kubenswrapper[4812]: W0131 04:42:20.996547 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod5f08c58b_776d_4693_a282_e192ecc83bc2.slice/crio-b70424e23323900a46dc8c57e4ef9e52a85ac47d08b84b4ec9c317010c8c4a9e WatchSource:0}: Error finding container b70424e23323900a46dc8c57e4ef9e52a85ac47d08b84b4ec9c317010c8c4a9e: Status 404 returned error can't find the container with id b70424e23323900a46dc8c57e4ef9e52a85ac47d08b84b4ec9c317010c8c4a9e Jan 31 04:42:21 crc kubenswrapper[4812]: I0131 04:42:21.004610 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:42:21 crc kubenswrapper[4812]: I0131 04:42:21.251603 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" event={"ID":"5f08c58b-776d-4693-a282-e192ecc83bc2","Type":"ContainerStarted","Data":"b70424e23323900a46dc8c57e4ef9e52a85ac47d08b84b4ec9c317010c8c4a9e"} Jan 31 04:42:21 crc kubenswrapper[4812]: I0131 04:42:21.907686 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:42:21 crc kubenswrapper[4812]: I0131 04:42:21.983961 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:42:25 crc kubenswrapper[4812]: I0131 04:42:25.280899 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" event={"ID":"5f08c58b-776d-4693-a282-e192ecc83bc2","Type":"ContainerStarted","Data":"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab"} Jan 31 04:42:25 crc kubenswrapper[4812]: I0131 04:42:25.297922 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" podStartSLOduration=2.058569347 podStartE2EDuration="5.297899626s" podCreationTimestamp="2026-01-31 04:42:20 +0000 UTC" firstStartedPulling="2026-01-31 04:42:20.999660921 +0000 UTC m=+949.494682606" lastFinishedPulling="2026-01-31 04:42:24.2389912 +0000 UTC m=+952.734012885" observedRunningTime="2026-01-31 04:42:25.297160667 +0000 UTC m=+953.792182382" watchObservedRunningTime="2026-01-31 04:42:25.297899626 +0000 UTC m=+953.792921311" Jan 31 04:42:25 crc kubenswrapper[4812]: I0131 04:42:25.574900 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:42:25 crc kubenswrapper[4812]: I0131 04:42:25.663249 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.377192 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.378765 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.391629 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.478099 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.478158 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tz4hg\" (UniqueName: \"kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.478207 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.579901 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.579960 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"kube-api-access-tz4hg\" (UniqueName: \"kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.580006 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.580480 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.580693 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.598756 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tz4hg\" (UniqueName: \"kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg\") pod \"certified-operators-v24qf\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:27 crc kubenswrapper[4812]: I0131 04:42:27.700259 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.105820 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:28 crc kubenswrapper[4812]: W0131 04:42:28.110408 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podac97f081_ab81_4678_9f9c_f9452de65591.slice/crio-2a7c96e5f7fef8ea18e4340b7b204878ce0d00ac6ca1bc9c5fb4d4a9c8975aa8 WatchSource:0}: Error finding container 2a7c96e5f7fef8ea18e4340b7b204878ce0d00ac6ca1bc9c5fb4d4a9c8975aa8: Status 404 returned error can't find the container with id 2a7c96e5f7fef8ea18e4340b7b204878ce0d00ac6ca1bc9c5fb4d4a9c8975aa8 Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.248631 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.249472 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.251491 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-default-user" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.252609 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-erlang-cookie" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.252722 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-plugins-conf" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.252856 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"rabbitmq-server-conf" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.253369 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"rabbitmq-server-dockercfg-2gtvf" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.261117 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305214 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305279 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305314 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305339 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6w8zm\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305358 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305400 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.305540 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc 
kubenswrapper[4812]: I0131 04:42:28.305581 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.313096 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerStarted","Data":"667180b8af4897822b0a1d3ce579744faa7c83e5c724e4fecffd646db14ef5df"} Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.313141 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerStarted","Data":"2a7c96e5f7fef8ea18e4340b7b204878ce0d00ac6ca1bc9c5fb4d4a9c8975aa8"} Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406628 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406709 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6w8zm\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406733 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-plugins\" (UniqueName: 
\"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406797 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406907 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406925 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406943 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.406979 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"plugins-conf\" (UniqueName: 
\"kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.408538 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.408695 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.409081 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.413371 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.415511 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret\") pod \"rabbitmq-server-0\" (UID: 
\"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.417360 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.430678 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6w8zm\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.432062 4812 csi_attacher.go:380] kubernetes.io/csi: attacher.MountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping MountDevice... 
Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.432098 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") device mount path \"/var/lib/kubelet/plugins/kubernetes.io/csi/kubevirt.io.hostpath-provisioner/7ecca3bcf55214e97eadee78e5aee987aaff3fdb5857925c04107adbf1f5aca7/globalmount\"" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.530228 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") pod \"rabbitmq-server-0\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:28 crc kubenswrapper[4812]: I0131 04:42:28.620676 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.091497 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.321585 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerStarted","Data":"26c2192037c4aae70592c14ee82aef35af6e4bd5e129e7b3fc890b0bf7bf9f2a"} Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.323057 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac97f081-ab81-4678-9f9c-f9452de65591" containerID="667180b8af4897822b0a1d3ce579744faa7c83e5c724e4fecffd646db14ef5df" exitCode=0 Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.323084 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerDied","Data":"667180b8af4897822b0a1d3ce579744faa7c83e5c724e4fecffd646db14ef5df"} Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.972677 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.974130 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:29 crc kubenswrapper[4812]: I0131 04:42:29.980953 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.027381 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-544wx\" (UniqueName: \"kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.027440 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.027459 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.132568 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.132631 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.132713 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-544wx\" (UniqueName: \"kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.134128 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.134222 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.167417 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-544wx\" (UniqueName: \"kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx\") pod \"community-operators-vgdtb\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.338468 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.343701 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac97f081-ab81-4678-9f9c-f9452de65591" containerID="621e9e9850a627512525c7b29381d6871c542ea251a516fe5c7dc33c478efed8" exitCode=0 Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.363332 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerDied","Data":"621e9e9850a627512525c7b29381d6871c542ea251a516fe5c7dc33c478efed8"} Jan 31 04:42:30 crc kubenswrapper[4812]: I0131 04:42:30.815029 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.351255 4812 generic.go:334] "Generic (PLEG): container finished" podID="b330b138-071d-4564-b2c0-b87753e0ed63" containerID="d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540" exitCode=0 Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.352090 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerDied","Data":"d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540"} Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.352114 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerStarted","Data":"6449d5740b9b67f15cb773d9a30201cc944524f9f7132c6934aa81b36f2996ff"} Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.359466 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" 
event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerStarted","Data":"1a7ecd6e634374e2bf29835f3410b321d69c7c0c1d84e3ba7116af1752cbe9ac"} Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.376933 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.378060 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.382778 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-index-dockercfg-wvd2s" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.401712 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.415903 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-v24qf" podStartSLOduration=2.900874964 podStartE2EDuration="4.415881614s" podCreationTimestamp="2026-01-31 04:42:27 +0000 UTC" firstStartedPulling="2026-01-31 04:42:29.32449045 +0000 UTC m=+957.819512115" lastFinishedPulling="2026-01-31 04:42:30.8394971 +0000 UTC m=+959.334518765" observedRunningTime="2026-01-31 04:42:31.409515702 +0000 UTC m=+959.904537367" watchObservedRunningTime="2026-01-31 04:42:31.415881614 +0000 UTC m=+959.910903289" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.457181 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7n8p\" (UniqueName: \"kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p\") pod \"keystone-operator-index-kgblt\" (UID: \"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b\") " pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 
04:42:31.559599 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v7n8p\" (UniqueName: \"kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p\") pod \"keystone-operator-index-kgblt\" (UID: \"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b\") " pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.576789 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7n8p\" (UniqueName: \"kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p\") pod \"keystone-operator-index-kgblt\" (UID: \"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b\") " pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:31 crc kubenswrapper[4812]: I0131 04:42:31.702577 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:34 crc kubenswrapper[4812]: I0131 04:42:34.436222 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:42:34 crc kubenswrapper[4812]: W0131 04:42:34.878572 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod928c7fc4_4d7d_43b9_9ae9_6b350dd4be1b.slice/crio-d130535c39f9b4bbed8fd34e4d59abdacf62747877ebc07862e17a886243ace1 WatchSource:0}: Error finding container d130535c39f9b4bbed8fd34e4d59abdacf62747877ebc07862e17a886243ace1: Status 404 returned error can't find the container with id d130535c39f9b4bbed8fd34e4d59abdacf62747877ebc07862e17a886243ace1 Jan 31 04:42:35 crc kubenswrapper[4812]: I0131 04:42:35.437474 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-kgblt" 
event={"ID":"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b","Type":"ContainerStarted","Data":"d130535c39f9b4bbed8fd34e4d59abdacf62747877ebc07862e17a886243ace1"} Jan 31 04:42:36 crc kubenswrapper[4812]: I0131 04:42:36.449039 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerStarted","Data":"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa"} Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.457664 4812 generic.go:334] "Generic (PLEG): container finished" podID="b330b138-071d-4564-b2c0-b87753e0ed63" containerID="422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa" exitCode=0 Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.457753 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerDied","Data":"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa"} Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.462765 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerStarted","Data":"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072"} Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.701106 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.701625 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:37 crc kubenswrapper[4812]: I0131 04:42:37.778867 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:38 crc kubenswrapper[4812]: I0131 04:42:38.472243 
4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerStarted","Data":"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3"} Jan 31 04:42:38 crc kubenswrapper[4812]: I0131 04:42:38.474136 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-kgblt" event={"ID":"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b","Type":"ContainerStarted","Data":"305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d"} Jan 31 04:42:38 crc kubenswrapper[4812]: I0131 04:42:38.502324 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-vgdtb" podStartSLOduration=3.699840041 podStartE2EDuration="9.502302257s" podCreationTimestamp="2026-01-31 04:42:29 +0000 UTC" firstStartedPulling="2026-01-31 04:42:32.145221543 +0000 UTC m=+960.640243208" lastFinishedPulling="2026-01-31 04:42:37.947683759 +0000 UTC m=+966.442705424" observedRunningTime="2026-01-31 04:42:38.49981574 +0000 UTC m=+966.994837445" watchObservedRunningTime="2026-01-31 04:42:38.502302257 +0000 UTC m=+966.997323932" Jan 31 04:42:38 crc kubenswrapper[4812]: I0131 04:42:38.525778 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:38 crc kubenswrapper[4812]: I0131 04:42:38.531381 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-index-kgblt" podStartSLOduration=5.05831761 podStartE2EDuration="7.531354178s" podCreationTimestamp="2026-01-31 04:42:31 +0000 UTC" firstStartedPulling="2026-01-31 04:42:34.881159749 +0000 UTC m=+963.376181414" lastFinishedPulling="2026-01-31 04:42:37.354196317 +0000 UTC m=+965.849217982" observedRunningTime="2026-01-31 04:42:38.524977776 +0000 UTC m=+967.019999461" watchObservedRunningTime="2026-01-31 
04:42:38.531354178 +0000 UTC m=+967.026375893" Jan 31 04:42:40 crc kubenswrapper[4812]: I0131 04:42:40.172669 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:40 crc kubenswrapper[4812]: I0131 04:42:40.351725 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:40 crc kubenswrapper[4812]: I0131 04:42:40.352133 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:41 crc kubenswrapper[4812]: I0131 04:42:41.403779 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/community-operators-vgdtb" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="registry-server" probeResult="failure" output=< Jan 31 04:42:41 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:42:41 crc kubenswrapper[4812]: > Jan 31 04:42:41 crc kubenswrapper[4812]: I0131 04:42:41.494822 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-v24qf" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="registry-server" containerID="cri-o://1a7ecd6e634374e2bf29835f3410b321d69c7c0c1d84e3ba7116af1752cbe9ac" gracePeriod=2 Jan 31 04:42:41 crc kubenswrapper[4812]: I0131 04:42:41.702917 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:41 crc kubenswrapper[4812]: I0131 04:42:41.703009 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:41 crc kubenswrapper[4812]: I0131 04:42:41.731646 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:42 crc 
kubenswrapper[4812]: I0131 04:42:42.536954 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:42:44 crc kubenswrapper[4812]: I0131 04:42:44.339343 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:42:44 crc kubenswrapper[4812]: I0131 04:42:44.339722 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:42:45 crc kubenswrapper[4812]: I0131 04:42:45.527713 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac97f081-ab81-4678-9f9c-f9452de65591" containerID="1a7ecd6e634374e2bf29835f3410b321d69c7c0c1d84e3ba7116af1752cbe9ac" exitCode=0 Jan 31 04:42:45 crc kubenswrapper[4812]: I0131 04:42:45.527772 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerDied","Data":"1a7ecd6e634374e2bf29835f3410b321d69c7c0c1d84e3ba7116af1752cbe9ac"} Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.221697 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5"] Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.223642 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.225664 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.239434 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5"] Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.375108 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.375388 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nmzqx\" (UniqueName: \"kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.375546 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 
04:42:46.476787 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-nmzqx\" (UniqueName: \"kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.476943 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.476989 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.477893 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.477990 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.509149 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-nmzqx\" (UniqueName: \"kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx\") pod \"b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:46 crc kubenswrapper[4812]: I0131 04:42:46.547570 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.001576 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5"] Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.079603 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.193736 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities\") pod \"ac97f081-ab81-4678-9f9c-f9452de65591\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.193901 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tz4hg\" (UniqueName: \"kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg\") pod \"ac97f081-ab81-4678-9f9c-f9452de65591\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.193963 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content\") pod \"ac97f081-ab81-4678-9f9c-f9452de65591\" (UID: \"ac97f081-ab81-4678-9f9c-f9452de65591\") " Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.200366 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities" (OuterVolumeSpecName: "utilities") pod "ac97f081-ab81-4678-9f9c-f9452de65591" (UID: "ac97f081-ab81-4678-9f9c-f9452de65591"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.202880 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg" (OuterVolumeSpecName: "kube-api-access-tz4hg") pod "ac97f081-ab81-4678-9f9c-f9452de65591" (UID: "ac97f081-ab81-4678-9f9c-f9452de65591"). InnerVolumeSpecName "kube-api-access-tz4hg". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.266786 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "ac97f081-ab81-4678-9f9c-f9452de65591" (UID: "ac97f081-ab81-4678-9f9c-f9452de65591"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.296178 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tz4hg\" (UniqueName: \"kubernetes.io/projected/ac97f081-ab81-4678-9f9c-f9452de65591-kube-api-access-tz4hg\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.296222 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.296236 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/ac97f081-ab81-4678-9f9c-f9452de65591-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.541538 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" event={"ID":"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8","Type":"ContainerStarted","Data":"8cfb43a4c5b2620d0b399f1e7fca42cbca573aa555a2d9cc01c6227f54eaabcd"} Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.543785 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-v24qf" event={"ID":"ac97f081-ab81-4678-9f9c-f9452de65591","Type":"ContainerDied","Data":"2a7c96e5f7fef8ea18e4340b7b204878ce0d00ac6ca1bc9c5fb4d4a9c8975aa8"} Jan 31 
04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.543858 4812 scope.go:117] "RemoveContainer" containerID="1a7ecd6e634374e2bf29835f3410b321d69c7c0c1d84e3ba7116af1752cbe9ac" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.543978 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-v24qf" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.561524 4812 scope.go:117] "RemoveContainer" containerID="621e9e9850a627512525c7b29381d6871c542ea251a516fe5c7dc33c478efed8" Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.571846 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.575774 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-v24qf"] Jan 31 04:42:47 crc kubenswrapper[4812]: I0131 04:42:47.600221 4812 scope.go:117] "RemoveContainer" containerID="667180b8af4897822b0a1d3ce579744faa7c83e5c724e4fecffd646db14ef5df" Jan 31 04:42:48 crc kubenswrapper[4812]: I0131 04:42:48.350682 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" path="/var/lib/kubelet/pods/ac97f081-ab81-4678-9f9c-f9452de65591/volumes" Jan 31 04:42:49 crc kubenswrapper[4812]: I0131 04:42:49.562374 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerID="c45a09b678314eb929c988fb9029a74d43c1da36cd2dc8ae1460e41d6e16d354" exitCode=0 Jan 31 04:42:49 crc kubenswrapper[4812]: I0131 04:42:49.562495 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" event={"ID":"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8","Type":"ContainerDied","Data":"c45a09b678314eb929c988fb9029a74d43c1da36cd2dc8ae1460e41d6e16d354"} Jan 31 04:42:50 crc kubenswrapper[4812]: I0131 
04:42:50.396516 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:50 crc kubenswrapper[4812]: I0131 04:42:50.521351 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:52 crc kubenswrapper[4812]: I0131 04:42:52.585828 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerID="3713b5538e0d15abbe0fdf84d12f55791eb2baa76307d6c76c24e27082f7b7ea" exitCode=0 Jan 31 04:42:52 crc kubenswrapper[4812]: I0131 04:42:52.585925 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" event={"ID":"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8","Type":"ContainerDied","Data":"3713b5538e0d15abbe0fdf84d12f55791eb2baa76307d6c76c24e27082f7b7ea"} Jan 31 04:42:53 crc kubenswrapper[4812]: I0131 04:42:53.597056 4812 generic.go:334] "Generic (PLEG): container finished" podID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerID="f135e6cd7937e468ad1585a4cd7483d96a6e60dfb2f3ca521cd1d8e3d4a07d00" exitCode=0 Jan 31 04:42:53 crc kubenswrapper[4812]: I0131 04:42:53.597152 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" event={"ID":"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8","Type":"ContainerDied","Data":"f135e6cd7937e468ad1585a4cd7483d96a6e60dfb2f3ca521cd1d8e3d4a07d00"} Jan 31 04:42:54 crc kubenswrapper[4812]: I0131 04:42:54.985157 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.108636 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nmzqx\" (UniqueName: \"kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx\") pod \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.108737 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle\") pod \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.108780 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util\") pod \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\" (UID: \"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8\") " Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.110245 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle" (OuterVolumeSpecName: "bundle") pod "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" (UID: "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.118065 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx" (OuterVolumeSpecName: "kube-api-access-nmzqx") pod "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" (UID: "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8"). InnerVolumeSpecName "kube-api-access-nmzqx". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.119898 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util" (OuterVolumeSpecName: "util") pod "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" (UID: "ac3f10a5-54f7-47a3-b6b4-412f5eae07f8"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.210674 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-nmzqx\" (UniqueName: \"kubernetes.io/projected/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-kube-api-access-nmzqx\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.210733 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.210784 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.564075 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.564323 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-vgdtb" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="registry-server" containerID="cri-o://785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3" gracePeriod=2 Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.618169 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" event={"ID":"ac3f10a5-54f7-47a3-b6b4-412f5eae07f8","Type":"ContainerDied","Data":"8cfb43a4c5b2620d0b399f1e7fca42cbca573aa555a2d9cc01c6227f54eaabcd"} Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.618215 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="8cfb43a4c5b2620d0b399f1e7fca42cbca573aa555a2d9cc01c6227f54eaabcd" Jan 31 04:42:55 crc kubenswrapper[4812]: I0131 04:42:55.618287 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.502509 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.629290 4812 generic.go:334] "Generic (PLEG): container finished" podID="b330b138-071d-4564-b2c0-b87753e0ed63" containerID="785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3" exitCode=0 Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.629343 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-vgdtb" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.629357 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerDied","Data":"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3"} Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.629410 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-vgdtb" event={"ID":"b330b138-071d-4564-b2c0-b87753e0ed63","Type":"ContainerDied","Data":"6449d5740b9b67f15cb773d9a30201cc944524f9f7132c6934aa81b36f2996ff"} Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.629440 4812 scope.go:117] "RemoveContainer" containerID="785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.638723 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-544wx\" (UniqueName: \"kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx\") pod \"b330b138-071d-4564-b2c0-b87753e0ed63\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.638873 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities\") pod \"b330b138-071d-4564-b2c0-b87753e0ed63\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.638918 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content\") pod \"b330b138-071d-4564-b2c0-b87753e0ed63\" (UID: \"b330b138-071d-4564-b2c0-b87753e0ed63\") " Jan 31 04:42:56 crc 
kubenswrapper[4812]: I0131 04:42:56.643488 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities" (OuterVolumeSpecName: "utilities") pod "b330b138-071d-4564-b2c0-b87753e0ed63" (UID: "b330b138-071d-4564-b2c0-b87753e0ed63"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.648039 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx" (OuterVolumeSpecName: "kube-api-access-544wx") pod "b330b138-071d-4564-b2c0-b87753e0ed63" (UID: "b330b138-071d-4564-b2c0-b87753e0ed63"). InnerVolumeSpecName "kube-api-access-544wx". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.678528 4812 scope.go:117] "RemoveContainer" containerID="422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.717160 4812 scope.go:117] "RemoveContainer" containerID="d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.726682 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "b330b138-071d-4564-b2c0-b87753e0ed63" (UID: "b330b138-071d-4564-b2c0-b87753e0ed63"). InnerVolumeSpecName "catalog-content". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.735714 4812 scope.go:117] "RemoveContainer" containerID="785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3" Jan 31 04:42:56 crc kubenswrapper[4812]: E0131 04:42:56.736106 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3\": container with ID starting with 785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3 not found: ID does not exist" containerID="785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.736140 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3"} err="failed to get container status \"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3\": rpc error: code = NotFound desc = could not find container \"785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3\": container with ID starting with 785052824d91321bddca060b18e053283b4214cca08137a38ae36b7f7b2131f3 not found: ID does not exist" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.736163 4812 scope.go:117] "RemoveContainer" containerID="422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa" Jan 31 04:42:56 crc kubenswrapper[4812]: E0131 04:42:56.736559 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa\": container with ID starting with 422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa not found: ID does not exist" containerID="422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.736584 
4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa"} err="failed to get container status \"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa\": rpc error: code = NotFound desc = could not find container \"422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa\": container with ID starting with 422a397afafec9b1d4ac0101c29aee7df21cf7d0496a5e167820e4a7ccd966fa not found: ID does not exist" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.736598 4812 scope.go:117] "RemoveContainer" containerID="d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540" Jan 31 04:42:56 crc kubenswrapper[4812]: E0131 04:42:56.736768 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540\": container with ID starting with d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540 not found: ID does not exist" containerID="d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.736786 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540"} err="failed to get container status \"d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540\": rpc error: code = NotFound desc = could not find container \"d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540\": container with ID starting with d2d39f15de7c0e1753f18d2329cae587d8d106bf033cdf3f134ebf4a0f285540 not found: ID does not exist" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.740465 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-544wx\" (UniqueName: 
\"kubernetes.io/projected/b330b138-071d-4564-b2c0-b87753e0ed63-kube-api-access-544wx\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.740772 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.740783 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/b330b138-071d-4564-b2c0-b87753e0ed63-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.962749 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:56 crc kubenswrapper[4812]: I0131 04:42:56.973750 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-vgdtb"] Jan 31 04:42:58 crc kubenswrapper[4812]: I0131 04:42:58.349010 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" path="/var/lib/kubelet/pods/b330b138-071d-4564-b2c0-b87753e0ed63/volumes" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.377560 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378226 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378242 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378263 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" 
containerName="extract-utilities" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378273 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="extract-utilities" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378281 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378289 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378353 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="pull" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378371 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="pull" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378385 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="extract-content" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378393 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="extract-content" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378403 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="util" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378410 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="util" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378423 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="extract" Jan 31 04:43:02 crc 
kubenswrapper[4812]: I0131 04:43:02.378431 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="extract" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378446 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="extract-utilities" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378455 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="extract-utilities" Jan 31 04:43:02 crc kubenswrapper[4812]: E0131 04:43:02.378468 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="extract-content" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378475 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="extract-content" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378608 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac97f081-ab81-4678-9f9c-f9452de65591" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378626 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b330b138-071d-4564-b2c0-b87753e0ed63" containerName="registry-server" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.378640 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" containerName="extract" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.380203 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.390886 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.515673 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.515764 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h6bmp\" (UniqueName: \"kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.515831 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.617155 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.617262 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-h6bmp\" (UniqueName: \"kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.617319 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.617695 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.617736 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.634800 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-h6bmp\" (UniqueName: \"kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp\") pod \"redhat-marketplace-2s4dx\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:02 crc kubenswrapper[4812]: I0131 04:43:02.697745 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.144873 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.669216 4812 generic.go:334] "Generic (PLEG): container finished" podID="c841ff71-e657-49fb-9390-720eed2b135c" containerID="3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279" exitCode=0 Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.669321 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerDied","Data":"3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279"} Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.669461 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerStarted","Data":"47ed139c2bc1425d40f3edc27b074a455c70994adf1144233f40a4ea44cf5f82"} Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.879057 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.879759 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.881585 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-dockercfg-zfgmt" Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.883693 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"keystone-operator-controller-manager-service-cert" Jan 31 04:43:03 crc kubenswrapper[4812]: I0131 04:43:03.899875 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.034255 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.034334 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xfbzt\" (UniqueName: \"kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.034363 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: 
\"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.135697 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.135817 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xfbzt\" (UniqueName: \"kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.135870 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.142681 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.142692 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.164328 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xfbzt\" (UniqueName: \"kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt\") pod \"keystone-operator-controller-manager-c77b796c8-kvlqm\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.194131 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.619943 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:43:04 crc kubenswrapper[4812]: W0131 04:43:04.626883 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd8a58fc_0e84_4670_998e_a615bb248ff4.slice/crio-477d4ca55539926b6b212b61f5c6cd591d452349235bae6cb164474a21aabc3a WatchSource:0}: Error finding container 477d4ca55539926b6b212b61f5c6cd591d452349235bae6cb164474a21aabc3a: Status 404 returned error can't find the container with id 477d4ca55539926b6b212b61f5c6cd591d452349235bae6cb164474a21aabc3a Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.676552 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" 
event={"ID":"bd8a58fc-0e84-4670-998e-a615bb248ff4","Type":"ContainerStarted","Data":"477d4ca55539926b6b212b61f5c6cd591d452349235bae6cb164474a21aabc3a"} Jan 31 04:43:04 crc kubenswrapper[4812]: I0131 04:43:04.679104 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerStarted","Data":"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9"} Jan 31 04:43:05 crc kubenswrapper[4812]: I0131 04:43:05.686267 4812 generic.go:334] "Generic (PLEG): container finished" podID="c841ff71-e657-49fb-9390-720eed2b135c" containerID="711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9" exitCode=0 Jan 31 04:43:05 crc kubenswrapper[4812]: I0131 04:43:05.686349 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerDied","Data":"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9"} Jan 31 04:43:08 crc kubenswrapper[4812]: I0131 04:43:08.715991 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerStarted","Data":"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8"} Jan 31 04:43:08 crc kubenswrapper[4812]: I0131 04:43:08.718109 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" event={"ID":"bd8a58fc-0e84-4670-998e-a615bb248ff4","Type":"ContainerStarted","Data":"0c99ce29918cee64610786b51cfe485eb830675b8765f10622a13f85b783e46d"} Jan 31 04:43:08 crc kubenswrapper[4812]: I0131 04:43:08.718351 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:08 crc kubenswrapper[4812]: I0131 04:43:08.737333 
4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-2s4dx" podStartSLOduration=2.490931473 podStartE2EDuration="6.73731618s" podCreationTimestamp="2026-01-31 04:43:02 +0000 UTC" firstStartedPulling="2026-01-31 04:43:03.670694821 +0000 UTC m=+992.165716486" lastFinishedPulling="2026-01-31 04:43:07.917079538 +0000 UTC m=+996.412101193" observedRunningTime="2026-01-31 04:43:08.733945769 +0000 UTC m=+997.228967464" watchObservedRunningTime="2026-01-31 04:43:08.73731618 +0000 UTC m=+997.232337845" Jan 31 04:43:08 crc kubenswrapper[4812]: I0131 04:43:08.754130 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" podStartSLOduration=2.452467984 podStartE2EDuration="5.754108622s" podCreationTimestamp="2026-01-31 04:43:03 +0000 UTC" firstStartedPulling="2026-01-31 04:43:04.629148978 +0000 UTC m=+993.124170643" lastFinishedPulling="2026-01-31 04:43:07.930789616 +0000 UTC m=+996.425811281" observedRunningTime="2026-01-31 04:43:08.753691501 +0000 UTC m=+997.248713196" watchObservedRunningTime="2026-01-31 04:43:08.754108622 +0000 UTC m=+997.249130297" Jan 31 04:43:09 crc kubenswrapper[4812]: I0131 04:43:09.727010 4812 generic.go:334] "Generic (PLEG): container finished" podID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerID="26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072" exitCode=0 Jan 31 04:43:09 crc kubenswrapper[4812]: I0131 04:43:09.727112 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerDied","Data":"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072"} Jan 31 04:43:10 crc kubenswrapper[4812]: I0131 04:43:10.736717 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" 
event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerStarted","Data":"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d"} Jan 31 04:43:10 crc kubenswrapper[4812]: I0131 04:43:10.737295 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:43:10 crc kubenswrapper[4812]: I0131 04:43:10.762358 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/rabbitmq-server-0" podStartSLOduration=37.004760443 podStartE2EDuration="43.762343879s" podCreationTimestamp="2026-01-31 04:42:27 +0000 UTC" firstStartedPulling="2026-01-31 04:42:29.109357776 +0000 UTC m=+957.604379451" lastFinishedPulling="2026-01-31 04:42:35.866941222 +0000 UTC m=+964.361962887" observedRunningTime="2026-01-31 04:43:10.759658246 +0000 UTC m=+999.254679921" watchObservedRunningTime="2026-01-31 04:43:10.762343879 +0000 UTC m=+999.257365534" Jan 31 04:43:12 crc kubenswrapper[4812]: I0131 04:43:12.699033 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:12 crc kubenswrapper[4812]: I0131 04:43:12.699365 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:12 crc kubenswrapper[4812]: I0131 04:43:12.771092 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:14 crc kubenswrapper[4812]: I0131 04:43:14.200494 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:43:14 crc kubenswrapper[4812]: I0131 04:43:14.338653 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get 
\"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:43:14 crc kubenswrapper[4812]: I0131 04:43:14.338733 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.293489 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8"] Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.295066 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.298171 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-db-secret" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.303355 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-create-6m4g7"] Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.304510 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.314964 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8"] Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.324664 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-6m4g7"] Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.431469 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.431813 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fp59z\" (UniqueName: \"kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.431942 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5k8g\" (UniqueName: \"kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.432260 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.534449 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.534613 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.534677 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fp59z\" (UniqueName: \"kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.534723 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-f5k8g\" (UniqueName: \"kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.536192 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.536425 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.557714 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5k8g\" (UniqueName: \"kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g\") pod \"keystone-db-create-6m4g7\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.564828 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fp59z\" (UniqueName: \"kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z\") pod \"keystone-0ca8-account-create-update-dgwc8\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.623943 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:18 crc kubenswrapper[4812]: I0131 04:43:18.633574 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.089114 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8"] Jan 31 04:43:19 crc kubenswrapper[4812]: W0131 04:43:19.096525 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8c6a9acd_6420_45da_9853_b06d5d4b053b.slice/crio-28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0 WatchSource:0}: Error finding container 28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0: Status 404 returned error can't find the container with id 28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0 Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.158579 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-create-6m4g7"] Jan 31 04:43:19 crc kubenswrapper[4812]: W0131 04:43:19.182342 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod95dc3255_b981_404f_8f25_641f12db9e86.slice/crio-80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58 WatchSource:0}: Error finding container 80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58: Status 404 returned error can't find the container with id 80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58 Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.804758 4812 generic.go:334] "Generic (PLEG): container finished" podID="8c6a9acd-6420-45da-9853-b06d5d4b053b" containerID="c399af486fae32254458fc1ec54740d3ecf0dfdaef20ac6e6ad6148d45b2bcce" exitCode=0 Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.804822 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" 
event={"ID":"8c6a9acd-6420-45da-9853-b06d5d4b053b","Type":"ContainerDied","Data":"c399af486fae32254458fc1ec54740d3ecf0dfdaef20ac6e6ad6148d45b2bcce"} Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.804898 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" event={"ID":"8c6a9acd-6420-45da-9853-b06d5d4b053b","Type":"ContainerStarted","Data":"28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0"} Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.806221 4812 generic.go:334] "Generic (PLEG): container finished" podID="95dc3255-b981-404f-8f25-641f12db9e86" containerID="27314d412b708c5ca257551340b90c82ea8fe6ec1679257cc9035ffd320bec1b" exitCode=0 Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.806249 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-6m4g7" event={"ID":"95dc3255-b981-404f-8f25-641f12db9e86","Type":"ContainerDied","Data":"27314d412b708c5ca257551340b90c82ea8fe6ec1679257cc9035ffd320bec1b"} Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.806263 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-6m4g7" event={"ID":"95dc3255-b981-404f-8f25-641f12db9e86","Type":"ContainerStarted","Data":"80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58"} Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.969097 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-index-dggq6"] Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.970012 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.972008 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-index-dockercfg-4xjv2" Jan 31 04:43:19 crc kubenswrapper[4812]: I0131 04:43:19.984103 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-dggq6"] Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.056161 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8lnxg\" (UniqueName: \"kubernetes.io/projected/bdfad5bb-7984-4248-9598-0319bb4543e0-kube-api-access-8lnxg\") pod \"horizon-operator-index-dggq6\" (UID: \"bdfad5bb-7984-4248-9598-0319bb4543e0\") " pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.158719 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8lnxg\" (UniqueName: \"kubernetes.io/projected/bdfad5bb-7984-4248-9598-0319bb4543e0-kube-api-access-8lnxg\") pod \"horizon-operator-index-dggq6\" (UID: \"bdfad5bb-7984-4248-9598-0319bb4543e0\") " pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.188688 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8lnxg\" (UniqueName: \"kubernetes.io/projected/bdfad5bb-7984-4248-9598-0319bb4543e0-kube-api-access-8lnxg\") pod \"horizon-operator-index-dggq6\" (UID: \"bdfad5bb-7984-4248-9598-0319bb4543e0\") " pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.283722 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.741284 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-index-dggq6"] Jan 31 04:43:20 crc kubenswrapper[4812]: I0131 04:43:20.814126 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-dggq6" event={"ID":"bdfad5bb-7984-4248-9598-0319bb4543e0","Type":"ContainerStarted","Data":"05e59cbb13c2f35a5f63840db7a1c94b9fe14b1878085228debf8d49e5d5d572"} Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.190628 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.194887 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287025 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts\") pod \"95dc3255-b981-404f-8f25-641f12db9e86\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287072 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts\") pod \"8c6a9acd-6420-45da-9853-b06d5d4b053b\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287106 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f5k8g\" (UniqueName: \"kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g\") pod 
\"95dc3255-b981-404f-8f25-641f12db9e86\" (UID: \"95dc3255-b981-404f-8f25-641f12db9e86\") " Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287795 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "95dc3255-b981-404f-8f25-641f12db9e86" (UID: "95dc3255-b981-404f-8f25-641f12db9e86"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287885 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "8c6a9acd-6420-45da-9853-b06d5d4b053b" (UID: "8c6a9acd-6420-45da-9853-b06d5d4b053b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.287985 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fp59z\" (UniqueName: \"kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z\") pod \"8c6a9acd-6420-45da-9853-b06d5d4b053b\" (UID: \"8c6a9acd-6420-45da-9853-b06d5d4b053b\") " Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.288412 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/95dc3255-b981-404f-8f25-641f12db9e86-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.288438 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/8c6a9acd-6420-45da-9853-b06d5d4b053b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.291571 4812 operation_generator.go:803] 
UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g" (OuterVolumeSpecName: "kube-api-access-f5k8g") pod "95dc3255-b981-404f-8f25-641f12db9e86" (UID: "95dc3255-b981-404f-8f25-641f12db9e86"). InnerVolumeSpecName "kube-api-access-f5k8g". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.291621 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z" (OuterVolumeSpecName: "kube-api-access-fp59z") pod "8c6a9acd-6420-45da-9853-b06d5d4b053b" (UID: "8c6a9acd-6420-45da-9853-b06d5d4b053b"). InnerVolumeSpecName "kube-api-access-fp59z". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.389561 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-f5k8g\" (UniqueName: \"kubernetes.io/projected/95dc3255-b981-404f-8f25-641f12db9e86-kube-api-access-f5k8g\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.389601 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fp59z\" (UniqueName: \"kubernetes.io/projected/8c6a9acd-6420-45da-9853-b06d5d4b053b-kube-api-access-fp59z\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.824055 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.824068 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8" event={"ID":"8c6a9acd-6420-45da-9853-b06d5d4b053b","Type":"ContainerDied","Data":"28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0"} Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.824118 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="28c588aa3ca95556a871f1d03f20083710b6b1d8784c5094f269262865d9fda0" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.826088 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-create-6m4g7" event={"ID":"95dc3255-b981-404f-8f25-641f12db9e86","Type":"ContainerDied","Data":"80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58"} Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.826126 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="80d1786274e82a553a52a25a8cac1bf397383ba8cf2825a490ae03981abaac58" Jan 31 04:43:21 crc kubenswrapper[4812]: I0131 04:43:21.826172 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-create-6m4g7" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.578876 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:43:22 crc kubenswrapper[4812]: E0131 04:43:22.579342 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="95dc3255-b981-404f-8f25-641f12db9e86" containerName="mariadb-database-create" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.579386 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="95dc3255-b981-404f-8f25-641f12db9e86" containerName="mariadb-database-create" Jan 31 04:43:22 crc kubenswrapper[4812]: E0131 04:43:22.579427 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8c6a9acd-6420-45da-9853-b06d5d4b053b" containerName="mariadb-account-create-update" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.579447 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8c6a9acd-6420-45da-9853-b06d5d4b053b" containerName="mariadb-account-create-update" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.579752 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8c6a9acd-6420-45da-9853-b06d5d4b053b" containerName="mariadb-account-create-update" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.579800 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="95dc3255-b981-404f-8f25-641f12db9e86" containerName="mariadb-database-create" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.580731 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.584081 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-index-dockercfg-6fztn" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.590584 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.711157 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w9fvb\" (UniqueName: \"kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb\") pod \"swift-operator-index-ffcjk\" (UID: \"ea90fec5-c635-4a58-9cc4-ff55147e2c26\") " pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.784304 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.816380 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w9fvb\" (UniqueName: \"kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb\") pod \"swift-operator-index-ffcjk\" (UID: \"ea90fec5-c635-4a58-9cc4-ff55147e2c26\") " pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.837562 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-index-dggq6" event={"ID":"bdfad5bb-7984-4248-9598-0319bb4543e0","Type":"ContainerStarted","Data":"985a12c4cdae11d831c45fc31be729c5f0f769fb99030dc13c0e84d1d157a01e"} Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.841049 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w9fvb\" (UniqueName: 
\"kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb\") pod \"swift-operator-index-ffcjk\" (UID: \"ea90fec5-c635-4a58-9cc4-ff55147e2c26\") " pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.857948 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-index-dggq6" podStartSLOduration=2.218005066 podStartE2EDuration="3.857922316s" podCreationTimestamp="2026-01-31 04:43:19 +0000 UTC" firstStartedPulling="2026-01-31 04:43:20.741510446 +0000 UTC m=+1009.236532111" lastFinishedPulling="2026-01-31 04:43:22.381427656 +0000 UTC m=+1010.876449361" observedRunningTime="2026-01-31 04:43:22.854874484 +0000 UTC m=+1011.349896179" watchObservedRunningTime="2026-01-31 04:43:22.857922316 +0000 UTC m=+1011.352944011" Jan 31 04:43:22 crc kubenswrapper[4812]: I0131 04:43:22.910452 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:23 crc kubenswrapper[4812]: I0131 04:43:23.369519 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:43:23 crc kubenswrapper[4812]: W0131 04:43:23.378382 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podea90fec5_c635_4a58_9cc4_ff55147e2c26.slice/crio-b9a2cc0204aa5f0aa80ab71b647a91d0491e9777208a957fc0dbde58a431eed2 WatchSource:0}: Error finding container b9a2cc0204aa5f0aa80ab71b647a91d0491e9777208a957fc0dbde58a431eed2: Status 404 returned error can't find the container with id b9a2cc0204aa5f0aa80ab71b647a91d0491e9777208a957fc0dbde58a431eed2 Jan 31 04:43:23 crc kubenswrapper[4812]: I0131 04:43:23.845871 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-ffcjk" 
event={"ID":"ea90fec5-c635-4a58-9cc4-ff55147e2c26","Type":"ContainerStarted","Data":"b9a2cc0204aa5f0aa80ab71b647a91d0491e9777208a957fc0dbde58a431eed2"} Jan 31 04:43:25 crc kubenswrapper[4812]: I0131 04:43:25.863986 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-ffcjk" event={"ID":"ea90fec5-c635-4a58-9cc4-ff55147e2c26","Type":"ContainerStarted","Data":"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470"} Jan 31 04:43:25 crc kubenswrapper[4812]: I0131 04:43:25.879768 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-index-ffcjk" podStartSLOduration=1.992614332 podStartE2EDuration="3.879747945s" podCreationTimestamp="2026-01-31 04:43:22 +0000 UTC" firstStartedPulling="2026-01-31 04:43:23.381944707 +0000 UTC m=+1011.876966422" lastFinishedPulling="2026-01-31 04:43:25.26907837 +0000 UTC m=+1013.764100035" observedRunningTime="2026-01-31 04:43:25.877274168 +0000 UTC m=+1014.372295873" watchObservedRunningTime="2026-01-31 04:43:25.879747945 +0000 UTC m=+1014.374769620" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.165496 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.166277 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-2s4dx" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="registry-server" containerID="cri-o://71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8" gracePeriod=2 Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.615220 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.623878 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.729381 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities\") pod \"c841ff71-e657-49fb-9390-720eed2b135c\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.729516 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content\") pod \"c841ff71-e657-49fb-9390-720eed2b135c\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.729613 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h6bmp\" (UniqueName: \"kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp\") pod \"c841ff71-e657-49fb-9390-720eed2b135c\" (UID: \"c841ff71-e657-49fb-9390-720eed2b135c\") " Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.731180 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities" (OuterVolumeSpecName: "utilities") pod "c841ff71-e657-49fb-9390-720eed2b135c" (UID: "c841ff71-e657-49fb-9390-720eed2b135c"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.756112 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp" (OuterVolumeSpecName: "kube-api-access-h6bmp") pod "c841ff71-e657-49fb-9390-720eed2b135c" (UID: "c841ff71-e657-49fb-9390-720eed2b135c"). InnerVolumeSpecName "kube-api-access-h6bmp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.771276 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "c841ff71-e657-49fb-9390-720eed2b135c" (UID: "c841ff71-e657-49fb-9390-720eed2b135c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.831683 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-h6bmp\" (UniqueName: \"kubernetes.io/projected/c841ff71-e657-49fb-9390-720eed2b135c-kube-api-access-h6bmp\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.831715 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.831725 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/c841ff71-e657-49fb-9390-720eed2b135c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.882069 4812 generic.go:334] "Generic (PLEG): container finished" podID="c841ff71-e657-49fb-9390-720eed2b135c" 
containerID="71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8" exitCode=0 Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.882125 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerDied","Data":"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8"} Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.882132 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-marketplace-2s4dx" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.882166 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-2s4dx" event={"ID":"c841ff71-e657-49fb-9390-720eed2b135c","Type":"ContainerDied","Data":"47ed139c2bc1425d40f3edc27b074a455c70994adf1144233f40a4ea44cf5f82"} Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.882194 4812 scope.go:117] "RemoveContainer" containerID="71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.900256 4812 scope.go:117] "RemoveContainer" containerID="711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.923501 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.929784 4812 scope.go:117] "RemoveContainer" containerID="3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.931976 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-2s4dx"] Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.951787 4812 scope.go:117] "RemoveContainer" containerID="71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8" Jan 31 
04:43:28 crc kubenswrapper[4812]: E0131 04:43:28.954069 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8\": container with ID starting with 71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8 not found: ID does not exist" containerID="71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.954118 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8"} err="failed to get container status \"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8\": rpc error: code = NotFound desc = could not find container \"71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8\": container with ID starting with 71d956b983c266f2fe3f21dc9366bf08f40e9212ec1d718b52d8d9a74064e6d8 not found: ID does not exist" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.954144 4812 scope.go:117] "RemoveContainer" containerID="711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9" Jan 31 04:43:28 crc kubenswrapper[4812]: E0131 04:43:28.954497 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9\": container with ID starting with 711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9 not found: ID does not exist" containerID="711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.954538 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9"} err="failed to get container status 
\"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9\": rpc error: code = NotFound desc = could not find container \"711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9\": container with ID starting with 711ea4f8cf5e7bd95c85efcb04a4c829dcc5b7117ced25cd1758d3cb0d33baf9 not found: ID does not exist" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.954565 4812 scope.go:117] "RemoveContainer" containerID="3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279" Jan 31 04:43:28 crc kubenswrapper[4812]: E0131 04:43:28.955005 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279\": container with ID starting with 3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279 not found: ID does not exist" containerID="3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279" Jan 31 04:43:28 crc kubenswrapper[4812]: I0131 04:43:28.955029 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279"} err="failed to get container status \"3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279\": rpc error: code = NotFound desc = could not find container \"3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279\": container with ID starting with 3a6e70b4c707b96327d5fff4b3f92747e8f62c5c3ec47a46d54ffe1cab3f2279 not found: ID does not exist" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.182323 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-db-sync-pdfkq"] Jan 31 04:43:29 crc kubenswrapper[4812]: E0131 04:43:29.183722 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="extract-utilities" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.183813 
4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="extract-utilities" Jan 31 04:43:29 crc kubenswrapper[4812]: E0131 04:43:29.183920 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="extract-content" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.184016 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="extract-content" Jan 31 04:43:29 crc kubenswrapper[4812]: E0131 04:43:29.184092 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="registry-server" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.184157 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="registry-server" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.184361 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c841ff71-e657-49fb-9390-720eed2b135c" containerName="registry-server" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.184953 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.187269 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.187397 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.187824 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-hcttt" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.188372 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.192489 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-pdfkq"] Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.338157 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data\") pod \"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.338271 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6lp\" (UniqueName: \"kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp\") pod \"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.440193 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data\") pod 
\"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.440588 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6lp\" (UniqueName: \"kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp\") pod \"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.444544 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data\") pod \"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.460915 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6lp\" (UniqueName: \"kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp\") pod \"keystone-db-sync-pdfkq\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") " pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.507306 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.758744 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-pdfkq"] Jan 31 04:43:29 crc kubenswrapper[4812]: I0131 04:43:29.891268 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" event={"ID":"c5b8de02-d057-468f-9df6-b18f83bc6dbe","Type":"ContainerStarted","Data":"5e9e5e5a9f10a38f614fc64156f57cd84059549bca6751a215d4db4008da0a3b"} Jan 31 04:43:30 crc kubenswrapper[4812]: I0131 04:43:30.287064 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:30 crc kubenswrapper[4812]: I0131 04:43:30.287182 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:30 crc kubenswrapper[4812]: I0131 04:43:30.317920 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:30 crc kubenswrapper[4812]: I0131 04:43:30.347967 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c841ff71-e657-49fb-9390-720eed2b135c" path="/var/lib/kubelet/pods/c841ff71-e657-49fb-9390-720eed2b135c/volumes" Jan 31 04:43:30 crc kubenswrapper[4812]: I0131 04:43:30.922858 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-index-dggq6" Jan 31 04:43:32 crc kubenswrapper[4812]: I0131 04:43:32.911878 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:32 crc kubenswrapper[4812]: I0131 04:43:32.912244 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:32 crc kubenswrapper[4812]: I0131 
04:43:32.937806 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:33 crc kubenswrapper[4812]: I0131 04:43:33.943995 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.415463 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"] Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.417485 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.421006 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.424440 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"] Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.520368 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkg2q\" (UniqueName: \"kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.520523 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: 
\"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.520732 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.621375 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.621460 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dkg2q\" (UniqueName: \"kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.621494 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 
04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.621877 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.621885 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.651299 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkg2q\" (UniqueName: \"kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q\") pod \"70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") " pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:39 crc kubenswrapper[4812]: I0131 04:43:39.736421 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.201230 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld"] Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.206714 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.213718 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld"] Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.229548 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sncwj\" (UniqueName: \"kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.229596 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.229673 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.330729 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.330792 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-sncwj\" (UniqueName: \"kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.330833 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.331472 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.331648 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: 
\"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.349598 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-sncwj\" (UniqueName: \"kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj\") pod \"920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") " pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:40 crc kubenswrapper[4812]: I0131 04:43:40.524480 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.127153 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld"] Jan 31 04:43:41 crc kubenswrapper[4812]: W0131 04:43:41.138616 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod398de10e_71b7_41cb_ac1c_b2d5f8fffa75.slice/crio-1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25 WatchSource:0}: Error finding container 1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25: Status 404 returned error can't find the container with id 1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25 Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.236791 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"] Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.989454 4812 generic.go:334] "Generic (PLEG): container finished" podID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" 
containerID="588d3df720bb535deffc0afc51252bc5957c252cd93cf984553af9c5c8cce071" exitCode=0 Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.989544 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" event={"ID":"398de10e-71b7-41cb-ac1c-b2d5f8fffa75","Type":"ContainerDied","Data":"588d3df720bb535deffc0afc51252bc5957c252cd93cf984553af9c5c8cce071"} Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.989825 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" event={"ID":"398de10e-71b7-41cb-ac1c-b2d5f8fffa75","Type":"ContainerStarted","Data":"1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25"} Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.991900 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" event={"ID":"c5b8de02-d057-468f-9df6-b18f83bc6dbe","Type":"ContainerStarted","Data":"94ec430977786d84f20f896cd6ed99d48f9ab9a151000b0cf26539f8fcd1e9cf"} Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.993554 4812 generic.go:334] "Generic (PLEG): container finished" podID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerID="e52c41cd53f076fe1c1fd277272600b69fd0fc22d5ca3dadd963169d93ded8b0" exitCode=0 Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.993582 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" event={"ID":"e2a060a9-1558-46fd-b8eb-8b72af226b1f","Type":"ContainerDied","Data":"e52c41cd53f076fe1c1fd277272600b69fd0fc22d5ca3dadd963169d93ded8b0"} Jan 31 04:43:41 crc kubenswrapper[4812]: I0131 04:43:41.993597 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" 
event={"ID":"e2a060a9-1558-46fd-b8eb-8b72af226b1f","Type":"ContainerStarted","Data":"ca0ed7097ffd0b20b03b439370a0f5488a8c85386a0185f36fd5447bc3fc48c6"} Jan 31 04:43:42 crc kubenswrapper[4812]: I0131 04:43:42.041680 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" podStartSLOduration=1.796854696 podStartE2EDuration="13.041661678s" podCreationTimestamp="2026-01-31 04:43:29 +0000 UTC" firstStartedPulling="2026-01-31 04:43:29.76945283 +0000 UTC m=+1018.264474495" lastFinishedPulling="2026-01-31 04:43:41.014259812 +0000 UTC m=+1029.509281477" observedRunningTime="2026-01-31 04:43:42.041075261 +0000 UTC m=+1030.536096936" watchObservedRunningTime="2026-01-31 04:43:42.041661678 +0000 UTC m=+1030.536683343" Jan 31 04:43:44 crc kubenswrapper[4812]: I0131 04:43:44.337960 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:43:44 crc kubenswrapper[4812]: I0131 04:43:44.338324 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:43:44 crc kubenswrapper[4812]: I0131 04:43:44.338398 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:43:44 crc kubenswrapper[4812]: I0131 04:43:44.339247 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:43:44 crc kubenswrapper[4812]: I0131 04:43:44.339316 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3" gracePeriod=600 Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.019956 4812 generic.go:334] "Generic (PLEG): container finished" podID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerID="a9a5b731319dd8729038bc05b80dcaa9557a5c7d753d1e0ed8524290308db031" exitCode=0 Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.020077 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" event={"ID":"e2a060a9-1558-46fd-b8eb-8b72af226b1f","Type":"ContainerDied","Data":"a9a5b731319dd8729038bc05b80dcaa9557a5c7d753d1e0ed8524290308db031"} Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.029490 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3" exitCode=0 Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.029592 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3"} Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.029663 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2"}
Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.029689 4812 scope.go:117] "RemoveContainer" containerID="8eec189e5f64e5907eb85688b79b50a9ecc03fb99c8c2ed8b673292e41a75382"
Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.036083 4812 generic.go:334] "Generic (PLEG): container finished" podID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerID="ffaec29a9c04cb8629899f6ea2a1f7e35eaadec4c841aa059fa20bdfa378fdc2" exitCode=0
Jan 31 04:43:45 crc kubenswrapper[4812]: I0131 04:43:45.036142 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" event={"ID":"398de10e-71b7-41cb-ac1c-b2d5f8fffa75","Type":"ContainerDied","Data":"ffaec29a9c04cb8629899f6ea2a1f7e35eaadec4c841aa059fa20bdfa378fdc2"}
Jan 31 04:43:46 crc kubenswrapper[4812]: I0131 04:43:46.045997 4812 generic.go:334] "Generic (PLEG): container finished" podID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerID="dec5444b1215453156339665e2f31518887e45ccf575b66dd88cd1f8e8c95d5d" exitCode=0
Jan 31 04:43:46 crc kubenswrapper[4812]: I0131 04:43:46.046103 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" event={"ID":"e2a060a9-1558-46fd-b8eb-8b72af226b1f","Type":"ContainerDied","Data":"dec5444b1215453156339665e2f31518887e45ccf575b66dd88cd1f8e8c95d5d"}
Jan 31 04:43:46 crc kubenswrapper[4812]: I0131 04:43:46.053510 4812 generic.go:334] "Generic (PLEG): container finished" podID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerID="d4396d03ae61231a0b14bcfd81fb97311a22b8d17efdb147f9a471097f262a51" exitCode=0
Jan 31 04:43:46 crc kubenswrapper[4812]: I0131 04:43:46.053611 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" event={"ID":"398de10e-71b7-41cb-ac1c-b2d5f8fffa75","Type":"ContainerDied","Data":"d4396d03ae61231a0b14bcfd81fb97311a22b8d17efdb147f9a471097f262a51"}
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.064872 4812 generic.go:334] "Generic (PLEG): container finished" podID="c5b8de02-d057-468f-9df6-b18f83bc6dbe" containerID="94ec430977786d84f20f896cd6ed99d48f9ab9a151000b0cf26539f8fcd1e9cf" exitCode=0
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.064938 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" event={"ID":"c5b8de02-d057-468f-9df6-b18f83bc6dbe","Type":"ContainerDied","Data":"94ec430977786d84f20f896cd6ed99d48f9ab9a151000b0cf26539f8fcd1e9cf"}
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.384885 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.389494 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld"
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533365 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sncwj\" (UniqueName: \"kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj\") pod \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util\") pod \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533445 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle\") pod \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533476 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle\") pod \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533522 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkg2q\" (UniqueName: \"kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q\") pod \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\" (UID: \"e2a060a9-1558-46fd-b8eb-8b72af226b1f\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.533567 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util\") pod \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\" (UID: \"398de10e-71b7-41cb-ac1c-b2d5f8fffa75\") "
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.534321 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle" (OuterVolumeSpecName: "bundle") pod "398de10e-71b7-41cb-ac1c-b2d5f8fffa75" (UID: "398de10e-71b7-41cb-ac1c-b2d5f8fffa75"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.534826 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle" (OuterVolumeSpecName: "bundle") pod "e2a060a9-1558-46fd-b8eb-8b72af226b1f" (UID: "e2a060a9-1558-46fd-b8eb-8b72af226b1f"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.538616 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q" (OuterVolumeSpecName: "kube-api-access-dkg2q") pod "e2a060a9-1558-46fd-b8eb-8b72af226b1f" (UID: "e2a060a9-1558-46fd-b8eb-8b72af226b1f"). InnerVolumeSpecName "kube-api-access-dkg2q". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.539983 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj" (OuterVolumeSpecName: "kube-api-access-sncwj") pod "398de10e-71b7-41cb-ac1c-b2d5f8fffa75" (UID: "398de10e-71b7-41cb-ac1c-b2d5f8fffa75"). InnerVolumeSpecName "kube-api-access-sncwj". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.549207 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util" (OuterVolumeSpecName: "util") pod "398de10e-71b7-41cb-ac1c-b2d5f8fffa75" (UID: "398de10e-71b7-41cb-ac1c-b2d5f8fffa75"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.635555 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dkg2q\" (UniqueName: \"kubernetes.io/projected/e2a060a9-1558-46fd-b8eb-8b72af226b1f-kube-api-access-dkg2q\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.635610 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-util\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.635628 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-sncwj\" (UniqueName: \"kubernetes.io/projected/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-kube-api-access-sncwj\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.635672 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.635696 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/398de10e-71b7-41cb-ac1c-b2d5f8fffa75-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.667888 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util" (OuterVolumeSpecName: "util") pod "e2a060a9-1558-46fd-b8eb-8b72af226b1f" (UID: "e2a060a9-1558-46fd-b8eb-8b72af226b1f"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:43:47 crc kubenswrapper[4812]: I0131 04:43:47.737149 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/e2a060a9-1558-46fd-b8eb-8b72af226b1f-util\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.076441 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.076444 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw" event={"ID":"e2a060a9-1558-46fd-b8eb-8b72af226b1f","Type":"ContainerDied","Data":"ca0ed7097ffd0b20b03b439370a0f5488a8c85386a0185f36fd5447bc3fc48c6"}
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.076606 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="ca0ed7097ffd0b20b03b439370a0f5488a8c85386a0185f36fd5447bc3fc48c6"
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.079495 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld" event={"ID":"398de10e-71b7-41cb-ac1c-b2d5f8fffa75","Type":"ContainerDied","Data":"1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25"}
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.079522 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="1478a0de8602519a80127051d92f2f6be8683b42fa434c97c03166a3e8891e25"
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.079566 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld"
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.427305 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-pdfkq"
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.549035 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk6lp\" (UniqueName: \"kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp\") pod \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") "
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.549213 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data\") pod \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\" (UID: \"c5b8de02-d057-468f-9df6-b18f83bc6dbe\") "
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.555168 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp" (OuterVolumeSpecName: "kube-api-access-vk6lp") pod "c5b8de02-d057-468f-9df6-b18f83bc6dbe" (UID: "c5b8de02-d057-468f-9df6-b18f83bc6dbe"). InnerVolumeSpecName "kube-api-access-vk6lp". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.595539 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data" (OuterVolumeSpecName: "config-data") pod "c5b8de02-d057-468f-9df6-b18f83bc6dbe" (UID: "c5b8de02-d057-468f-9df6-b18f83bc6dbe"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.650624 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c5b8de02-d057-468f-9df6-b18f83bc6dbe-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:48 crc kubenswrapper[4812]: I0131 04:43:48.650818 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vk6lp\" (UniqueName: \"kubernetes.io/projected/c5b8de02-d057-468f-9df6-b18f83bc6dbe-kube-api-access-vk6lp\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.089515 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-db-sync-pdfkq" event={"ID":"c5b8de02-d057-468f-9df6-b18f83bc6dbe","Type":"ContainerDied","Data":"5e9e5e5a9f10a38f614fc64156f57cd84059549bca6751a215d4db4008da0a3b"}
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.090289 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5e9e5e5a9f10a38f614fc64156f57cd84059549bca6751a215d4db4008da0a3b"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.089635 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-db-sync-pdfkq"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.280375 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-czvn5"]
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.281218 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.281309 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.281401 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="util"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.281471 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="util"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.281538 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="pull"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.281608 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="pull"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.281682 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="pull"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.281754 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="pull"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.281831 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="util"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.281930 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="util"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.282007 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.282083 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: E0131 04:43:49.282154 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c5b8de02-d057-468f-9df6-b18f83bc6dbe" containerName="keystone-db-sync"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.282223 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c5b8de02-d057-468f-9df6-b18f83bc6dbe" containerName="keystone-db-sync"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.282435 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="398de10e-71b7-41cb-ac1c-b2d5f8fffa75" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.282524 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c5b8de02-d057-468f-9df6-b18f83bc6dbe" containerName="keystone-db-sync"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.282606 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" containerName="extract"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.283162 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.287105 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"osp-secret"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.287276 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.287483 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-hcttt"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.287664 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.287768 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.293307 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-czvn5"]
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.462639 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.462999 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.463139 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lfm77\" (UniqueName: \"kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.463371 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.463496 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.564888 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.565270 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.565487 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lfm77\" (UniqueName: \"kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.565715 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.565914 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.572398 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.572705 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.573703 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.574159 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.583538 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lfm77\" (UniqueName: \"kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77\") pod \"keystone-bootstrap-czvn5\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") " pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:49 crc kubenswrapper[4812]: I0131 04:43:49.611587 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:50 crc kubenswrapper[4812]: I0131 04:43:50.055652 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-czvn5"]
Jan 31 04:43:50 crc kubenswrapper[4812]: W0131 04:43:50.060736 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda33f3025_e1cb_4935_b358_3badf3fda335.slice/crio-58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3 WatchSource:0}: Error finding container 58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3: Status 404 returned error can't find the container with id 58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3
Jan 31 04:43:50 crc kubenswrapper[4812]: I0131 04:43:50.102919 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-czvn5" event={"ID":"a33f3025-e1cb-4935-b358-3badf3fda335","Type":"ContainerStarted","Data":"58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3"}
Jan 31 04:43:51 crc kubenswrapper[4812]: I0131 04:43:51.115158 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-czvn5" event={"ID":"a33f3025-e1cb-4935-b358-3badf3fda335","Type":"ContainerStarted","Data":"15ff3d3a4018d218ea8ff943bce08ba751225296dc5ab51db1be49868daf375d"}
Jan 31 04:43:51 crc kubenswrapper[4812]: I0131 04:43:51.142725 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-bootstrap-czvn5" podStartSLOduration=2.142703042 podStartE2EDuration="2.142703042s" podCreationTimestamp="2026-01-31 04:43:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:43:51.135686853 +0000 UTC m=+1039.630708548" watchObservedRunningTime="2026-01-31 04:43:51.142703042 +0000 UTC m=+1039.637724737"
Jan 31 04:43:54 crc kubenswrapper[4812]: I0131 04:43:54.146688 4812 generic.go:334] "Generic (PLEG): container finished" podID="a33f3025-e1cb-4935-b358-3badf3fda335" containerID="15ff3d3a4018d218ea8ff943bce08ba751225296dc5ab51db1be49868daf375d" exitCode=0
Jan 31 04:43:54 crc kubenswrapper[4812]: I0131 04:43:54.146890 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-czvn5" event={"ID":"a33f3025-e1cb-4935-b358-3badf3fda335","Type":"ContainerDied","Data":"15ff3d3a4018d218ea8ff943bce08ba751225296dc5ab51db1be49868daf375d"}
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.437001 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.548946 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data\") pod \"a33f3025-e1cb-4935-b358-3badf3fda335\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") "
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.549066 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lfm77\" (UniqueName: \"kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77\") pod \"a33f3025-e1cb-4935-b358-3badf3fda335\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") "
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.549163 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys\") pod \"a33f3025-e1cb-4935-b358-3badf3fda335\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") "
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.549318 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys\") pod \"a33f3025-e1cb-4935-b358-3badf3fda335\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") "
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.549396 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts\") pod \"a33f3025-e1cb-4935-b358-3badf3fda335\" (UID: \"a33f3025-e1cb-4935-b358-3badf3fda335\") "
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.554491 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "a33f3025-e1cb-4935-b358-3badf3fda335" (UID: "a33f3025-e1cb-4935-b358-3badf3fda335"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.554732 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts" (OuterVolumeSpecName: "scripts") pod "a33f3025-e1cb-4935-b358-3badf3fda335" (UID: "a33f3025-e1cb-4935-b358-3badf3fda335"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.556642 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77" (OuterVolumeSpecName: "kube-api-access-lfm77") pod "a33f3025-e1cb-4935-b358-3badf3fda335" (UID: "a33f3025-e1cb-4935-b358-3badf3fda335"). InnerVolumeSpecName "kube-api-access-lfm77". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.575040 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "a33f3025-e1cb-4935-b358-3badf3fda335" (UID: "a33f3025-e1cb-4935-b358-3badf3fda335"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.577252 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data" (OuterVolumeSpecName: "config-data") pod "a33f3025-e1cb-4935-b358-3badf3fda335" (UID: "a33f3025-e1cb-4935-b358-3badf3fda335"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.651094 4812 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-credential-keys\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.651350 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.651360 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.651370 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lfm77\" (UniqueName: \"kubernetes.io/projected/a33f3025-e1cb-4935-b358-3badf3fda335-kube-api-access-lfm77\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:55 crc kubenswrapper[4812]: I0131 04:43:55.651379 4812 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/a33f3025-e1cb-4935-b358-3badf3fda335-fernet-keys\") on node \"crc\" DevicePath \"\""
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.159876 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-bootstrap-czvn5" event={"ID":"a33f3025-e1cb-4935-b358-3badf3fda335","Type":"ContainerDied","Data":"58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3"}
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.159929 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="58936bb601df13180df9e22e2bdca3fe2d9a7286c615297bd242f6b48fd3a6a3"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.159936 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-bootstrap-czvn5"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.254731 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"]
Jan 31 04:43:56 crc kubenswrapper[4812]: E0131 04:43:56.255059 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a33f3025-e1cb-4935-b358-3badf3fda335" containerName="keystone-bootstrap"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.255083 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a33f3025-e1cb-4935-b358-3badf3fda335" containerName="keystone-bootstrap"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.255234 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a33f3025-e1cb-4935-b358-3badf3fda335" containerName="keystone-bootstrap"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.256007 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.258872 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-scripts"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.259233 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-config-data"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.259481 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.259650 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"keystone-keystone-dockercfg-hcttt"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.268228 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"]
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.362290 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2ndp\" (UniqueName: \"kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.362363 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.362571 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.362651 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.362677 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.464155 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-x2ndp\" (UniqueName: \"kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.464225 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms"
Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.465068 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName:
\"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.465303 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.465419 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.468254 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.468518 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.468611 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: 
\"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.469514 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.481247 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2ndp\" (UniqueName: \"kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp\") pod \"keystone-5c49cbbfd-wfwms\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:56 crc kubenswrapper[4812]: I0131 04:43:56.578281 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:57 crc kubenswrapper[4812]: I0131 04:43:57.016540 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"] Jan 31 04:43:57 crc kubenswrapper[4812]: I0131 04:43:57.168094 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" event={"ID":"949a611c-00dc-4dac-9068-0dc00cf79572","Type":"ContainerStarted","Data":"c42d502265c3eb4d4196536705a7a9ea7e2b842ca955631a95c15552cf960050"} Jan 31 04:43:59 crc kubenswrapper[4812]: I0131 04:43:59.181492 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" event={"ID":"949a611c-00dc-4dac-9068-0dc00cf79572","Type":"ContainerStarted","Data":"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab"} Jan 31 04:43:59 crc kubenswrapper[4812]: I0131 04:43:59.182660 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:43:59 crc kubenswrapper[4812]: I0131 04:43:59.204382 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" podStartSLOduration=3.204364119 podStartE2EDuration="3.204364119s" podCreationTimestamp="2026-01-31 04:43:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:43:59.200948826 +0000 UTC m=+1047.695970561" watchObservedRunningTime="2026-01-31 04:43:59.204364119 +0000 UTC m=+1047.699385784" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.835561 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.836820 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.838645 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-dockercfg-kfrnt" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.838669 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"swift-operator-controller-manager-service-cert" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.854031 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.996533 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: 
\"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.996905 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w86fs\" (UniqueName: \"kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:10 crc kubenswrapper[4812]: I0131 04:44:10.996937 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.098230 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-w86fs\" (UniqueName: \"kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.098308 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.098367 
4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.106480 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.108881 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.114736 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-w86fs\" (UniqueName: \"kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs\") pod \"swift-operator-controller-manager-6d8cb97c5-8h5w2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.203138 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.388759 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb"] Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.401759 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.407450 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-service-cert" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.407676 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"horizon-operator-controller-manager-dockercfg-s7wq5" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.409137 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb"] Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.504139 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-apiservice-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.504244 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vbrwn\" (UniqueName: \"kubernetes.io/projected/500efc8e-b639-4788-833f-3cb9189e1009-kube-api-access-vbrwn\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " 
pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.504415 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-webhook-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.605611 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-webhook-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.606554 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-apiservice-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.606598 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vbrwn\" (UniqueName: \"kubernetes.io/projected/500efc8e-b639-4788-833f-3cb9189e1009-kube-api-access-vbrwn\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.610781 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-webhook-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.613983 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/500efc8e-b639-4788-833f-3cb9189e1009-apiservice-cert\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.633052 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vbrwn\" (UniqueName: \"kubernetes.io/projected/500efc8e-b639-4788-833f-3cb9189e1009-kube-api-access-vbrwn\") pod \"horizon-operator-controller-manager-5d479c78f-kf6sb\" (UID: \"500efc8e-b639-4788-833f-3cb9189e1009\") " pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.717932 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:44:11 crc kubenswrapper[4812]: W0131 04:44:11.723268 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0921e685_9db0_446d_9ed9_9ac2016fffc2.slice/crio-1185970a060e074ec2015ba16c197498290f2e107ea4242d39f833a2c0abea3d WatchSource:0}: Error finding container 1185970a060e074ec2015ba16c197498290f2e107ea4242d39f833a2c0abea3d: Status 404 returned error can't find the container with id 1185970a060e074ec2015ba16c197498290f2e107ea4242d39f833a2c0abea3d Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.732362 
4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:11 crc kubenswrapper[4812]: I0131 04:44:11.741782 4812 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 04:44:12 crc kubenswrapper[4812]: I0131 04:44:12.290686 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" event={"ID":"0921e685-9db0-446d-9ed9-9ac2016fffc2","Type":"ContainerStarted","Data":"1185970a060e074ec2015ba16c197498290f2e107ea4242d39f833a2c0abea3d"} Jan 31 04:44:12 crc kubenswrapper[4812]: I0131 04:44:12.301922 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb"] Jan 31 04:44:12 crc kubenswrapper[4812]: W0131 04:44:12.308313 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod500efc8e_b639_4788_833f_3cb9189e1009.slice/crio-6452bc194d861b8521c16ab23387fd74b3106b02dd6ede6b307ad071d59a2d0e WatchSource:0}: Error finding container 6452bc194d861b8521c16ab23387fd74b3106b02dd6ede6b307ad071d59a2d0e: Status 404 returned error can't find the container with id 6452bc194d861b8521c16ab23387fd74b3106b02dd6ede6b307ad071d59a2d0e Jan 31 04:44:13 crc kubenswrapper[4812]: I0131 04:44:13.298995 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" event={"ID":"500efc8e-b639-4788-833f-3cb9189e1009","Type":"ContainerStarted","Data":"6452bc194d861b8521c16ab23387fd74b3106b02dd6ede6b307ad071d59a2d0e"} Jan 31 04:44:15 crc kubenswrapper[4812]: I0131 04:44:15.322786 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" 
event={"ID":"500efc8e-b639-4788-833f-3cb9189e1009","Type":"ContainerStarted","Data":"8aa9847ea8b077f0ca8d975e5f11a762233b9830db3cdf5e3ed482cc4dd9878d"} Jan 31 04:44:15 crc kubenswrapper[4812]: I0131 04:44:15.323473 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:17 crc kubenswrapper[4812]: I0131 04:44:17.341788 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" event={"ID":"0921e685-9db0-446d-9ed9-9ac2016fffc2","Type":"ContainerStarted","Data":"b9eca0ee3a1c8d0d0edf1334d72b1aa44595672978076e3111bc0136618e21d0"} Jan 31 04:44:17 crc kubenswrapper[4812]: I0131 04:44:17.343143 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:17 crc kubenswrapper[4812]: I0131 04:44:17.381375 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" podStartSLOduration=4.349727745 podStartE2EDuration="6.381349251s" podCreationTimestamp="2026-01-31 04:44:11 +0000 UTC" firstStartedPulling="2026-01-31 04:44:12.31172401 +0000 UTC m=+1060.806745685" lastFinishedPulling="2026-01-31 04:44:14.343345526 +0000 UTC m=+1062.838367191" observedRunningTime="2026-01-31 04:44:15.350254439 +0000 UTC m=+1063.845276114" watchObservedRunningTime="2026-01-31 04:44:17.381349251 +0000 UTC m=+1065.876370926" Jan 31 04:44:17 crc kubenswrapper[4812]: I0131 04:44:17.383095 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" podStartSLOduration=2.256977825 podStartE2EDuration="7.383086678s" podCreationTimestamp="2026-01-31 04:44:10 +0000 UTC" firstStartedPulling="2026-01-31 04:44:11.741187926 +0000 UTC m=+1060.236209631" 
lastFinishedPulling="2026-01-31 04:44:16.867296809 +0000 UTC m=+1065.362318484" observedRunningTime="2026-01-31 04:44:17.372076391 +0000 UTC m=+1065.867098096" watchObservedRunningTime="2026-01-31 04:44:17.383086678 +0000 UTC m=+1065.878108343" Jan 31 04:44:21 crc kubenswrapper[4812]: I0131 04:44:21.739172 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/horizon-operator-controller-manager-5d479c78f-kf6sb" Jan 31 04:44:28 crc kubenswrapper[4812]: I0131 04:44:28.003776 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:44:31 crc kubenswrapper[4812]: I0131 04:44:31.211581 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.572242 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.573262 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.575484 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-index-dockercfg-z6kj8" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.587821 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.667417 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhxdz\" (UniqueName: \"kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz\") pod \"glance-operator-index-gqrtf\" (UID: \"a84ac0d3-4f66-44f0-8566-37dd9a31bb66\") " pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.768316 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhxdz\" (UniqueName: \"kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz\") pod \"glance-operator-index-gqrtf\" (UID: \"a84ac0d3-4f66-44f0-8566-37dd9a31bb66\") " pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.791536 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhxdz\" (UniqueName: \"kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz\") pod \"glance-operator-index-gqrtf\" (UID: \"a84ac0d3-4f66-44f0-8566-37dd9a31bb66\") " pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:33 crc kubenswrapper[4812]: I0131 04:44:33.954362 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.227642 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.231812 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.233397 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-conf" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.233887 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-storage-config-data" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.234178 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-files" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.241917 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-swift-dockercfg-bqht4" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.270654 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.378531 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.378579 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " 
pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.378618 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.378768 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbxck\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.378815 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.418077 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.476797 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-gqrtf" event={"ID":"a84ac0d3-4f66-44f0-8566-37dd9a31bb66","Type":"ContainerStarted","Data":"7cebf7d9a0d92404c987af65ee7395672a289702e6dc15a351b27e757fc3bce1"} Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.480579 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wbxck\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck\") pod \"swift-storage-0\" (UID: 
\"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.480735 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.480883 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.480994 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.480914 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.481094 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.481175 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:44:34.981151617 +0000 UTC m=+1083.476173282 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.481277 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.481419 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.481468 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.481714 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") device mount path \"/mnt/openstack/pv15\"" pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.504282 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " 
pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.509357 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbxck\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.762079 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-mw6hz"] Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.764257 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.772009 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-mw6hz"] Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.775376 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-scripts" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.775433 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"swift-proxy-config-data" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.782613 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"swift-ring-config-data" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886021 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886362 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886393 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z85nc\" (UniqueName: \"kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886476 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886503 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.886529 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 
04:44:34.988797 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.988877 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.988899 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-z85nc\" (UniqueName: \"kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.988941 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.988958 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.988978 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.989008 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.989145 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.989158 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: E0131 04:44:34.989201 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:44:35.989186227 +0000 UTC m=+1084.484207892 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.989646 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.989675 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.989912 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.993568 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:34 crc kubenswrapper[4812]: I0131 04:44:34.993629 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"swiftconf\" (UniqueName: 
\"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:35 crc kubenswrapper[4812]: I0131 04:44:35.006540 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-z85nc\" (UniqueName: \"kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc\") pod \"swift-ring-rebalance-mw6hz\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:35 crc kubenswrapper[4812]: I0131 04:44:35.084586 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.030703 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.031210 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.031227 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.031279 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:44:38.031260738 +0000 UTC m=+1086.526282413 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.165657 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.167739 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.178263 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.314120 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-mw6hz"] Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.335014 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.335246 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.335309 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.335387 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.335442 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-92f4g\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.436938 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437023 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-92f4g\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437095 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437186 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437221 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.437370 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.437391 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437426 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.437444 4812 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift podName:2b92b5ab-106c-4ea9-a9f3-461ec016b1ae nodeName:}" failed. No retries permitted until 2026-01-31 04:44:36.937426503 +0000 UTC m=+1085.432448178 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift") pod "swift-proxy-6d699db77c-g7fm4" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae") : configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.437599 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.443617 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.461875 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-92f4g\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: W0131 04:44:36.516467 4812 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod55388d4d_80ca_4221_9a99_37c116dda83e.slice/crio-f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8 WatchSource:0}: Error finding container f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8: Status 404 returned error can't find the container with id f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8 Jan 31 04:44:36 crc kubenswrapper[4812]: I0131 04:44:36.943571 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.943657 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.944173 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4: configmap "swift-ring-files" not found Jan 31 04:44:36 crc kubenswrapper[4812]: E0131 04:44:36.944237 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift podName:2b92b5ab-106c-4ea9-a9f3-461ec016b1ae nodeName:}" failed. No retries permitted until 2026-01-31 04:44:37.944217639 +0000 UTC m=+1086.439239304 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift") pod "swift-proxy-6d699db77c-g7fm4" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae") : configmap "swift-ring-files" not found Jan 31 04:44:37 crc kubenswrapper[4812]: I0131 04:44:37.497472 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" event={"ID":"55388d4d-80ca-4221-9a99-37c116dda83e","Type":"ContainerStarted","Data":"f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8"} Jan 31 04:44:37 crc kubenswrapper[4812]: I0131 04:44:37.955169 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:37 crc kubenswrapper[4812]: E0131 04:44:37.955328 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:37 crc kubenswrapper[4812]: E0131 04:44:37.955617 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4: configmap "swift-ring-files" not found Jan 31 04:44:37 crc kubenswrapper[4812]: E0131 04:44:37.955708 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift podName:2b92b5ab-106c-4ea9-a9f3-461ec016b1ae nodeName:}" failed. No retries permitted until 2026-01-31 04:44:39.955682025 +0000 UTC m=+1088.450703690 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift") pod "swift-proxy-6d699db77c-g7fm4" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae") : configmap "swift-ring-files" not found Jan 31 04:44:38 crc kubenswrapper[4812]: I0131 04:44:38.056743 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:38 crc kubenswrapper[4812]: E0131 04:44:38.057026 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:38 crc kubenswrapper[4812]: E0131 04:44:38.057044 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:38 crc kubenswrapper[4812]: E0131 04:44:38.057093 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:44:42.057076067 +0000 UTC m=+1090.552097732 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:38 crc kubenswrapper[4812]: I0131 04:44:38.506034 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-gqrtf" event={"ID":"a84ac0d3-4f66-44f0-8566-37dd9a31bb66","Type":"ContainerStarted","Data":"7ab6ed6bbc496845b23c712bc1de59a49f4d382adcab786248a95bb9c417d985"} Jan 31 04:44:38 crc kubenswrapper[4812]: I0131 04:44:38.526550 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-index-gqrtf" podStartSLOduration=2.173682588 podStartE2EDuration="5.526526627s" podCreationTimestamp="2026-01-31 04:44:33 +0000 UTC" firstStartedPulling="2026-01-31 04:44:34.41783076 +0000 UTC m=+1082.912852425" lastFinishedPulling="2026-01-31 04:44:37.770674799 +0000 UTC m=+1086.265696464" observedRunningTime="2026-01-31 04:44:38.521366728 +0000 UTC m=+1087.016388393" watchObservedRunningTime="2026-01-31 04:44:38.526526627 +0000 UTC m=+1087.021548292" Jan 31 04:44:39 crc kubenswrapper[4812]: I0131 04:44:39.985728 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:39 crc kubenswrapper[4812]: E0131 04:44:39.986758 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:39 crc kubenswrapper[4812]: E0131 04:44:39.986784 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4: configmap 
"swift-ring-files" not found Jan 31 04:44:39 crc kubenswrapper[4812]: E0131 04:44:39.986878 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift podName:2b92b5ab-106c-4ea9-a9f3-461ec016b1ae nodeName:}" failed. No retries permitted until 2026-01-31 04:44:43.986859599 +0000 UTC m=+1092.481881264 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift") pod "swift-proxy-6d699db77c-g7fm4" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae") : configmap "swift-ring-files" not found Jan 31 04:44:42 crc kubenswrapper[4812]: I0131 04:44:42.119318 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:42 crc kubenswrapper[4812]: E0131 04:44:42.119504 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:42 crc kubenswrapper[4812]: E0131 04:44:42.119779 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:42 crc kubenswrapper[4812]: E0131 04:44:42.119880 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:44:50.119855966 +0000 UTC m=+1098.614877631 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:43 crc kubenswrapper[4812]: I0131 04:44:43.955144 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:43 crc kubenswrapper[4812]: I0131 04:44:43.955793 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:43 crc kubenswrapper[4812]: I0131 04:44:43.978042 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:44 crc kubenswrapper[4812]: I0131 04:44:44.054266 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:44 crc kubenswrapper[4812]: E0131 04:44:44.054454 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:44 crc kubenswrapper[4812]: E0131 04:44:44.054514 4812 projected.go:194] Error preparing data for projected volume etc-swift for pod glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4: configmap "swift-ring-files" not found Jan 31 04:44:44 crc kubenswrapper[4812]: E0131 04:44:44.054680 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift podName:2b92b5ab-106c-4ea9-a9f3-461ec016b1ae nodeName:}" failed. 
No retries permitted until 2026-01-31 04:44:52.054661503 +0000 UTC m=+1100.549683168 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift") pod "swift-proxy-6d699db77c-g7fm4" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae") : configmap "swift-ring-files" not found Jan 31 04:44:44 crc kubenswrapper[4812]: I0131 04:44:44.554067 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" event={"ID":"55388d4d-80ca-4221-9a99-37c116dda83e","Type":"ContainerStarted","Data":"206d46e9bde7ae9d382efa060b105b383b2e08bc746f62d96eb4203d8118d58f"} Jan 31 04:44:44 crc kubenswrapper[4812]: I0131 04:44:44.581530 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" podStartSLOduration=3.387096553 podStartE2EDuration="10.58151272s" podCreationTimestamp="2026-01-31 04:44:34 +0000 UTC" firstStartedPulling="2026-01-31 04:44:36.520319086 +0000 UTC m=+1085.015340751" lastFinishedPulling="2026-01-31 04:44:43.714735253 +0000 UTC m=+1092.209756918" observedRunningTime="2026-01-31 04:44:44.578519209 +0000 UTC m=+1093.073540874" watchObservedRunningTime="2026-01-31 04:44:44.58151272 +0000 UTC m=+1093.076534385" Jan 31 04:44:44 crc kubenswrapper[4812]: I0131 04:44:44.589113 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.419167 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc"] Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.421896 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.424341 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"default-dockercfg-sklpv" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.430588 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc"] Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.507904 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.508156 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.508398 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sp98\" (UniqueName: \"kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 
04:44:47.610106 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.610217 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6sp98\" (UniqueName: \"kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.610260 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.610771 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.610771 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"bundle\" (UniqueName: 
\"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.641608 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sp98\" (UniqueName: \"kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98\") pod \"340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:47 crc kubenswrapper[4812]: I0131 04:44:47.738233 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:48 crc kubenswrapper[4812]: I0131 04:44:48.231722 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc"] Jan 31 04:44:48 crc kubenswrapper[4812]: W0131 04:44:48.241778 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1d4dc508_c84d_4128_a69c_9bddda59bb41.slice/crio-c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5 WatchSource:0}: Error finding container c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5: Status 404 returned error can't find the container with id c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5 Jan 31 04:44:48 crc kubenswrapper[4812]: I0131 04:44:48.586171 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerID="b2ba90f9040195ee909aad60a2d2c43363e3a2c7878447cc61bd827ab1c984f0" exitCode=0 Jan 31 
04:44:48 crc kubenswrapper[4812]: I0131 04:44:48.586243 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" event={"ID":"1d4dc508-c84d-4128-a69c-9bddda59bb41","Type":"ContainerDied","Data":"b2ba90f9040195ee909aad60a2d2c43363e3a2c7878447cc61bd827ab1c984f0"} Jan 31 04:44:48 crc kubenswrapper[4812]: I0131 04:44:48.586338 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" event={"ID":"1d4dc508-c84d-4128-a69c-9bddda59bb41","Type":"ContainerStarted","Data":"c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5"} Jan 31 04:44:49 crc kubenswrapper[4812]: I0131 04:44:49.596763 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerID="f4f45b62a2ea5a5922faf1d10c9506cee5231a9f26bc50e674d0f0a7f8b7dec5" exitCode=0 Jan 31 04:44:49 crc kubenswrapper[4812]: I0131 04:44:49.596834 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" event={"ID":"1d4dc508-c84d-4128-a69c-9bddda59bb41","Type":"ContainerDied","Data":"f4f45b62a2ea5a5922faf1d10c9506cee5231a9f26bc50e674d0f0a7f8b7dec5"} Jan 31 04:44:50 crc kubenswrapper[4812]: I0131 04:44:50.158880 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:44:50 crc kubenswrapper[4812]: E0131 04:44:50.159102 4812 projected.go:288] Couldn't get configMap glance-kuttl-tests/swift-ring-files: configmap "swift-ring-files" not found Jan 31 04:44:50 crc kubenswrapper[4812]: E0131 04:44:50.159360 4812 projected.go:194] Error preparing data for projected volume 
etc-swift for pod glance-kuttl-tests/swift-storage-0: configmap "swift-ring-files" not found Jan 31 04:44:50 crc kubenswrapper[4812]: E0131 04:44:50.159434 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift podName:bab986d4-81ba-4f72-a3fc-3b0cdb004c6e nodeName:}" failed. No retries permitted until 2026-01-31 04:45:06.159410327 +0000 UTC m=+1114.654432072 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "etc-swift" (UniqueName: "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift") pod "swift-storage-0" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e") : configmap "swift-ring-files" not found Jan 31 04:44:50 crc kubenswrapper[4812]: I0131 04:44:50.608566 4812 generic.go:334] "Generic (PLEG): container finished" podID="55388d4d-80ca-4221-9a99-37c116dda83e" containerID="206d46e9bde7ae9d382efa060b105b383b2e08bc746f62d96eb4203d8118d58f" exitCode=0 Jan 31 04:44:50 crc kubenswrapper[4812]: I0131 04:44:50.608653 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" event={"ID":"55388d4d-80ca-4221-9a99-37c116dda83e","Type":"ContainerDied","Data":"206d46e9bde7ae9d382efa060b105b383b2e08bc746f62d96eb4203d8118d58f"} Jan 31 04:44:50 crc kubenswrapper[4812]: I0131 04:44:50.615975 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerID="948baaf4e48b4f6d1f1c630de909d1920e5f31224d8aec5922c40658b33b3f19" exitCode=0 Jan 31 04:44:50 crc kubenswrapper[4812]: I0131 04:44:50.616030 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" event={"ID":"1d4dc508-c84d-4128-a69c-9bddda59bb41","Type":"ContainerDied","Data":"948baaf4e48b4f6d1f1c630de909d1920e5f31224d8aec5922c40658b33b3f19"} Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.027513 4812 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.033988 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.094697 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.103981 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"swift-proxy-6d699db77c-g7fm4\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.195998 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle\") pod \"1d4dc508-c84d-4128-a69c-9bddda59bb41\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196049 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196080 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6sp98\" (UniqueName: 
\"kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98\") pod \"1d4dc508-c84d-4128-a69c-9bddda59bb41\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196123 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196148 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util\") pod \"1d4dc508-c84d-4128-a69c-9bddda59bb41\" (UID: \"1d4dc508-c84d-4128-a69c-9bddda59bb41\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196177 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196195 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196222 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196256 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z85nc\" (UniqueName: \"kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc\") pod \"55388d4d-80ca-4221-9a99-37c116dda83e\" (UID: \"55388d4d-80ca-4221-9a99-37c116dda83e\") " Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196918 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle" (OuterVolumeSpecName: "bundle") pod "1d4dc508-c84d-4128-a69c-9bddda59bb41" (UID: "1d4dc508-c84d-4128-a69c-9bddda59bb41"). InnerVolumeSpecName "bundle". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.196958 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices" (OuterVolumeSpecName: "ring-data-devices") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "ring-data-devices". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.198490 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.200629 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98" (OuterVolumeSpecName: "kube-api-access-6sp98") pod "1d4dc508-c84d-4128-a69c-9bddda59bb41" (UID: "1d4dc508-c84d-4128-a69c-9bddda59bb41"). 
InnerVolumeSpecName "kube-api-access-6sp98". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.201299 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc" (OuterVolumeSpecName: "kube-api-access-z85nc") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "kube-api-access-z85nc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.202144 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf" (OuterVolumeSpecName: "dispersionconf") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "dispersionconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.211223 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util" (OuterVolumeSpecName: "util") pod "1d4dc508-c84d-4128-a69c-9bddda59bb41" (UID: "1d4dc508-c84d-4128-a69c-9bddda59bb41"). InnerVolumeSpecName "util". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.212786 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts" (OuterVolumeSpecName: "scripts") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.220945 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf" (OuterVolumeSpecName: "swiftconf") pod "55388d4d-80ca-4221-9a99-37c116dda83e" (UID: "55388d4d-80ca-4221-9a99-37c116dda83e"). InnerVolumeSpecName "swiftconf". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298376 4812 reconciler_common.go:293] "Volume detached for volume \"bundle\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298423 4812 reconciler_common.go:293] "Volume detached for volume \"swiftconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-swiftconf\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298437 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6sp98\" (UniqueName: \"kubernetes.io/projected/1d4dc508-c84d-4128-a69c-9bddda59bb41-kube-api-access-6sp98\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298449 4812 reconciler_common.go:293] "Volume detached for volume \"ring-data-devices\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-ring-data-devices\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298460 4812 reconciler_common.go:293] "Volume detached for volume \"util\" (UniqueName: \"kubernetes.io/empty-dir/1d4dc508-c84d-4128-a69c-9bddda59bb41-util\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298470 4812 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: 
\"kubernetes.io/empty-dir/55388d4d-80ca-4221-9a99-37c116dda83e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298480 4812 reconciler_common.go:293] "Volume detached for volume \"dispersionconf\" (UniqueName: \"kubernetes.io/secret/55388d4d-80ca-4221-9a99-37c116dda83e-dispersionconf\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298491 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/configmap/55388d4d-80ca-4221-9a99-37c116dda83e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.298501 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-z85nc\" (UniqueName: \"kubernetes.io/projected/55388d4d-80ca-4221-9a99-37c116dda83e-kube-api-access-z85nc\") on node \"crc\" DevicePath \"\"" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.387299 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.631962 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" event={"ID":"55388d4d-80ca-4221-9a99-37c116dda83e","Type":"ContainerDied","Data":"f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8"} Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.632375 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f4f6495a70f712e9faab0ea02992233c33bef2acc11bee9106736226522e46b8" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.631993 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-ring-rebalance-mw6hz" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.634818 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" event={"ID":"1d4dc508-c84d-4128-a69c-9bddda59bb41","Type":"ContainerDied","Data":"c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5"} Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.634854 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1089a22ce5fd72bb3a1e6d3140b894568da0a67d8b8ffc9a162120bfd276de5" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.634972 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc" Jan 31 04:44:52 crc kubenswrapper[4812]: I0131 04:44:52.848684 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:44:52 crc kubenswrapper[4812]: W0131 04:44:52.862435 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2b92b5ab_106c_4ea9_a9f3_461ec016b1ae.slice/crio-5e4e3ef6b9f40718d69513cc65dc753d997aa99542746b2a201399f5c66869cc WatchSource:0}: Error finding container 5e4e3ef6b9f40718d69513cc65dc753d997aa99542746b2a201399f5c66869cc: Status 404 returned error can't find the container with id 5e4e3ef6b9f40718d69513cc65dc753d997aa99542746b2a201399f5c66869cc Jan 31 04:44:53 crc kubenswrapper[4812]: I0131 04:44:53.648246 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerStarted","Data":"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07"} Jan 31 04:44:53 crc kubenswrapper[4812]: I0131 04:44:53.648774 4812 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerStarted","Data":"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0"} Jan 31 04:44:53 crc kubenswrapper[4812]: I0131 04:44:53.648807 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerStarted","Data":"5e4e3ef6b9f40718d69513cc65dc753d997aa99542746b2a201399f5c66869cc"} Jan 31 04:44:54 crc kubenswrapper[4812]: I0131 04:44:54.653257 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:54 crc kubenswrapper[4812]: I0131 04:44:54.654442 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:44:54 crc kubenswrapper[4812]: I0131 04:44:54.675049 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" podStartSLOduration=18.675029685 podStartE2EDuration="18.675029685s" podCreationTimestamp="2026-01-31 04:44:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:44:54.671013797 +0000 UTC m=+1103.166035482" watchObservedRunningTime="2026-01-31 04:44:54.675029685 +0000 UTC m=+1103.170051350" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.162433 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd"] Jan 31 04:45:00 crc kubenswrapper[4812]: E0131 04:45:00.163217 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="extract" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163232 4812 
state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="extract" Jan 31 04:45:00 crc kubenswrapper[4812]: E0131 04:45:00.163248 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="55388d4d-80ca-4221-9a99-37c116dda83e" containerName="swift-ring-rebalance" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163256 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="55388d4d-80ca-4221-9a99-37c116dda83e" containerName="swift-ring-rebalance" Jan 31 04:45:00 crc kubenswrapper[4812]: E0131 04:45:00.163272 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="pull" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163282 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="pull" Jan 31 04:45:00 crc kubenswrapper[4812]: E0131 04:45:00.163302 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="util" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163309 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="util" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163450 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" containerName="extract" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163466 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="55388d4d-80ca-4221-9a99-37c116dda83e" containerName="swift-ring-rebalance" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.163994 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.166041 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.166416 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.184315 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd"] Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.343733 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.343787 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-blg4p\" (UniqueName: \"kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.344154 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.445701 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.445759 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.445808 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-blg4p\" (UniqueName: \"kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.446661 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.451167 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: 
\"kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.465216 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-blg4p\" (UniqueName: \"kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p\") pod \"collect-profiles-29497245-hwxrd\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.485871 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:00 crc kubenswrapper[4812]: I0131 04:45:00.701229 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd"] Jan 31 04:45:00 crc kubenswrapper[4812]: W0131 04:45:00.706967 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod999fd628_3c3e_440e_b49e_a8e1a0754c8b.slice/crio-6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b WatchSource:0}: Error finding container 6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b: Status 404 returned error can't find the container with id 6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b Jan 31 04:45:01 crc kubenswrapper[4812]: I0131 04:45:01.699470 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" event={"ID":"999fd628-3c3e-440e-b49e-a8e1a0754c8b","Type":"ContainerStarted","Data":"6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b"} Jan 31 04:45:02 crc 
kubenswrapper[4812]: I0131 04:45:02.390595 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:45:02 crc kubenswrapper[4812]: I0131 04:45:02.403105 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:45:02 crc kubenswrapper[4812]: I0131 04:45:02.707210 4812 generic.go:334] "Generic (PLEG): container finished" podID="999fd628-3c3e-440e-b49e-a8e1a0754c8b" containerID="b2f476216d04cde57f4fa8abd77ccc45fb9a051d3837d163e05557846b20a431" exitCode=0 Jan 31 04:45:02 crc kubenswrapper[4812]: I0131 04:45:02.707313 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" event={"ID":"999fd628-3c3e-440e-b49e-a8e1a0754c8b","Type":"ContainerDied","Data":"b2f476216d04cde57f4fa8abd77ccc45fb9a051d3837d163e05557846b20a431"} Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.018278 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.201158 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-blg4p\" (UniqueName: \"kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p\") pod \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.201249 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume\") pod \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.201295 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume\") pod \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\" (UID: \"999fd628-3c3e-440e-b49e-a8e1a0754c8b\") " Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.202139 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume" (OuterVolumeSpecName: "config-volume") pod "999fd628-3c3e-440e-b49e-a8e1a0754c8b" (UID: "999fd628-3c3e-440e-b49e-a8e1a0754c8b"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.207303 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "999fd628-3c3e-440e-b49e-a8e1a0754c8b" (UID: "999fd628-3c3e-440e-b49e-a8e1a0754c8b"). InnerVolumeSpecName "secret-volume". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.207395 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p" (OuterVolumeSpecName: "kube-api-access-blg4p") pod "999fd628-3c3e-440e-b49e-a8e1a0754c8b" (UID: "999fd628-3c3e-440e-b49e-a8e1a0754c8b"). InnerVolumeSpecName "kube-api-access-blg4p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.311678 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-blg4p\" (UniqueName: \"kubernetes.io/projected/999fd628-3c3e-440e-b49e-a8e1a0754c8b-kube-api-access-blg4p\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.311725 4812 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/999fd628-3c3e-440e-b49e-a8e1a0754c8b-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.311738 4812 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/999fd628-3c3e-440e-b49e-a8e1a0754c8b-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.326337 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:45:04 crc kubenswrapper[4812]: E0131 04:45:04.326669 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="999fd628-3c3e-440e-b49e-a8e1a0754c8b" containerName="collect-profiles" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.326692 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="999fd628-3c3e-440e-b49e-a8e1a0754c8b" containerName="collect-profiles" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.326914 4812 
memory_manager.go:354] "RemoveStaleState removing state" podUID="999fd628-3c3e-440e-b49e-a8e1a0754c8b" containerName="collect-profiles" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.327451 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.333060 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-service-cert" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.333315 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openstack-operators"/"glance-operator-controller-manager-dockercfg-rc856" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.351461 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.514330 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.514378 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.514463 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vmqtk\" (UniqueName: \"kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.615991 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vmqtk\" (UniqueName: \"kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.616091 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.616125 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.620971 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert\") pod 
\"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.623379 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.637919 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vmqtk\" (UniqueName: \"kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk\") pod \"glance-operator-controller-manager-7d59b884bb-tn2x6\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.649453 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.723199 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" event={"ID":"999fd628-3c3e-440e-b49e-a8e1a0754c8b","Type":"ContainerDied","Data":"6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b"} Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.723665 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="6d7b9032fb5f7a07df9afc206c930161bf4c688f173a02b05d78fa35f562896b" Jan 31 04:45:04 crc kubenswrapper[4812]: I0131 04:45:04.723813 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497245-hwxrd" Jan 31 04:45:05 crc kubenswrapper[4812]: I0131 04:45:05.028980 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:45:05 crc kubenswrapper[4812]: W0131 04:45:05.032319 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod99392b95_2df1_4600_afd1_c6a4f4d47e5c.slice/crio-7d61e9358953928204894cb921dbaf815bc0ba5709aa7d5d9692bc6f9d718363 WatchSource:0}: Error finding container 7d61e9358953928204894cb921dbaf815bc0ba5709aa7d5d9692bc6f9d718363: Status 404 returned error can't find the container with id 7d61e9358953928204894cb921dbaf815bc0ba5709aa7d5d9692bc6f9d718363 Jan 31 04:45:05 crc kubenswrapper[4812]: I0131 04:45:05.736026 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" event={"ID":"99392b95-2df1-4600-afd1-c6a4f4d47e5c","Type":"ContainerStarted","Data":"7d61e9358953928204894cb921dbaf815bc0ba5709aa7d5d9692bc6f9d718363"} Jan 31 04:45:06 crc kubenswrapper[4812]: I0131 04:45:06.246242 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:45:06 crc kubenswrapper[4812]: I0131 04:45:06.254895 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"swift-storage-0\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:45:06 crc kubenswrapper[4812]: I0131 04:45:06.349599 4812 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:45:06 crc kubenswrapper[4812]: I0131 04:45:06.829057 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:45:06 crc kubenswrapper[4812]: W0131 04:45:06.836215 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbab986d4_81ba_4f72_a3fc_3b0cdb004c6e.slice/crio-5db0c7dca63414ed0916fd410bc4bb5f962aa194724d39e7b495d953d7230c31 WatchSource:0}: Error finding container 5db0c7dca63414ed0916fd410bc4bb5f962aa194724d39e7b495d953d7230c31: Status 404 returned error can't find the container with id 5db0c7dca63414ed0916fd410bc4bb5f962aa194724d39e7b495d953d7230c31 Jan 31 04:45:07 crc kubenswrapper[4812]: I0131 04:45:07.759628 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"5db0c7dca63414ed0916fd410bc4bb5f962aa194724d39e7b495d953d7230c31"} Jan 31 04:45:10 crc kubenswrapper[4812]: I0131 04:45:10.789021 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0"} Jan 31 04:45:10 crc kubenswrapper[4812]: I0131 04:45:10.789358 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd"} Jan 31 04:45:10 crc kubenswrapper[4812]: I0131 04:45:10.793568 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" 
event={"ID":"99392b95-2df1-4600-afd1-c6a4f4d47e5c","Type":"ContainerStarted","Data":"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c"} Jan 31 04:45:10 crc kubenswrapper[4812]: I0131 04:45:10.793910 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:10 crc kubenswrapper[4812]: I0131 04:45:10.834181 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" podStartSLOduration=1.519551903 podStartE2EDuration="6.834157503s" podCreationTimestamp="2026-01-31 04:45:04 +0000 UTC" firstStartedPulling="2026-01-31 04:45:05.035731198 +0000 UTC m=+1113.530752863" lastFinishedPulling="2026-01-31 04:45:10.350336798 +0000 UTC m=+1118.845358463" observedRunningTime="2026-01-31 04:45:10.824307768 +0000 UTC m=+1119.319329443" watchObservedRunningTime="2026-01-31 04:45:10.834157503 +0000 UTC m=+1119.329179178" Jan 31 04:45:11 crc kubenswrapper[4812]: I0131 04:45:11.812384 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a"} Jan 31 04:45:11 crc kubenswrapper[4812]: I0131 04:45:11.812809 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85"} Jan 31 04:45:13 crc kubenswrapper[4812]: I0131 04:45:13.831351 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3"} Jan 31 04:45:13 crc kubenswrapper[4812]: 
I0131 04:45:13.831794 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d"} Jan 31 04:45:13 crc kubenswrapper[4812]: I0131 04:45:13.831811 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8"} Jan 31 04:45:13 crc kubenswrapper[4812]: I0131 04:45:13.831823 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858116 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858556 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858572 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858584 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858596 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab"} Jan 31 04:45:15 crc kubenswrapper[4812]: I0131 04:45:15.858606 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc"} Jan 31 04:45:16 crc kubenswrapper[4812]: I0131 04:45:16.874201 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerStarted","Data":"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c"} Jan 31 04:45:16 crc kubenswrapper[4812]: I0131 04:45:16.930701 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/swift-storage-0" podStartSLOduration=36.019772377 podStartE2EDuration="43.930677988s" podCreationTimestamp="2026-01-31 04:44:33 +0000 UTC" firstStartedPulling="2026-01-31 04:45:06.839785128 +0000 UTC m=+1115.334806813" lastFinishedPulling="2026-01-31 04:45:14.750690759 +0000 UTC m=+1123.245712424" observedRunningTime="2026-01-31 04:45:16.924274356 +0000 UTC m=+1125.419296101" watchObservedRunningTime="2026-01-31 04:45:16.930677988 +0000 UTC m=+1125.425699673" Jan 31 04:45:24 crc kubenswrapper[4812]: I0131 04:45:24.661524 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.428875 4812 
kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-5mwsg"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.432264 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.436324 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-0480-account-create-update-kkssk"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.437415 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.439133 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.444385 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-5mwsg"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.452080 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-0480-account-create-update-kkssk"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.452787 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.452936 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hjfw8\" (UniqueName: \"kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " 
pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.452997 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fnbsd\" (UniqueName: \"kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.453150 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.493244 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.494506 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.497060 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.497253 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.497468 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.497673 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-f54zk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.500248 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553692 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553754 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553774 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: 
\"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553803 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hjfw8\" (UniqueName: \"kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553892 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcskt\" (UniqueName: \"kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.553923 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fnbsd\" (UniqueName: \"kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.554035 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.554083 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" 
(UniqueName: \"kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.554758 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.554908 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.573512 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hjfw8\" (UniqueName: \"kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8\") pod \"glance-db-create-5mwsg\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.573785 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fnbsd\" (UniqueName: \"kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd\") pod \"glance-0480-account-create-update-kkssk\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.656376 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.657351 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.657408 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.657414 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.657453 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rcskt\" (UniqueName: \"kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.658186 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: 
\"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.661592 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.677741 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcskt\" (UniqueName: \"kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt\") pod \"openstackclient\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.773175 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.779657 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:28 crc kubenswrapper[4812]: I0131 04:45:28.817998 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.234570 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-5mwsg"] Jan 31 04:45:29 crc kubenswrapper[4812]: W0131 04:45:29.241632 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod0b688963_1f4c_40de_84df_35d1dbb57591.slice/crio-684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3 WatchSource:0}: Error finding container 684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3: Status 404 returned error can't find the container with id 684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3 Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.307118 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-0480-account-create-update-kkssk"] Jan 31 04:45:29 crc kubenswrapper[4812]: W0131 04:45:29.314669 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod73c60048_6632_444d_bad5_6ed4a867c6a7.slice/crio-21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21 WatchSource:0}: Error finding container 21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21: Status 404 returned error can't find the container with id 21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21 Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.358336 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:45:29 crc kubenswrapper[4812]: W0131 04:45:29.379105 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod75023eed_958f_468c_83b4_57345fbe3b87.slice/crio-c51e33e30f0c9f40a67a8cd5853c683fe0d771cdc0d57c2a3c343551de460173 WatchSource:0}: Error finding 
container c51e33e30f0c9f40a67a8cd5853c683fe0d771cdc0d57c2a3c343551de460173: Status 404 returned error can't find the container with id c51e33e30f0c9f40a67a8cd5853c683fe0d771cdc0d57c2a3c343551de460173 Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.991975 4812 generic.go:334] "Generic (PLEG): container finished" podID="0b688963-1f4c-40de-84df-35d1dbb57591" containerID="bfbd93efb73a65ac1947c8d486ed67493c52be9dbd3c1737d4ae7f15934ab952" exitCode=0 Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.992154 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-5mwsg" event={"ID":"0b688963-1f4c-40de-84df-35d1dbb57591","Type":"ContainerDied","Data":"bfbd93efb73a65ac1947c8d486ed67493c52be9dbd3c1737d4ae7f15934ab952"} Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.992313 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-5mwsg" event={"ID":"0b688963-1f4c-40de-84df-35d1dbb57591","Type":"ContainerStarted","Data":"684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3"} Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.994330 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"75023eed-958f-468c-83b4-57345fbe3b87","Type":"ContainerStarted","Data":"c51e33e30f0c9f40a67a8cd5853c683fe0d771cdc0d57c2a3c343551de460173"} Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.997308 4812 generic.go:334] "Generic (PLEG): container finished" podID="73c60048-6632-444d-bad5-6ed4a867c6a7" containerID="26873c4d9a3014a080280bd37de48a3d706060c92eb74b2c763c03ae7385eb69" exitCode=0 Jan 31 04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.997358 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" event={"ID":"73c60048-6632-444d-bad5-6ed4a867c6a7","Type":"ContainerDied","Data":"26873c4d9a3014a080280bd37de48a3d706060c92eb74b2c763c03ae7385eb69"} Jan 31 
04:45:29 crc kubenswrapper[4812]: I0131 04:45:29.997423 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" event={"ID":"73c60048-6632-444d-bad5-6ed4a867c6a7","Type":"ContainerStarted","Data":"21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21"} Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.368977 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.373058 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.394456 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts\") pod \"73c60048-6632-444d-bad5-6ed4a867c6a7\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.394504 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts\") pod \"0b688963-1f4c-40de-84df-35d1dbb57591\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.394589 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hjfw8\" (UniqueName: \"kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8\") pod \"0b688963-1f4c-40de-84df-35d1dbb57591\" (UID: \"0b688963-1f4c-40de-84df-35d1dbb57591\") " Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.394626 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fnbsd\" (UniqueName: 
\"kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd\") pod \"73c60048-6632-444d-bad5-6ed4a867c6a7\" (UID: \"73c60048-6632-444d-bad5-6ed4a867c6a7\") " Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.395279 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "73c60048-6632-444d-bad5-6ed4a867c6a7" (UID: "73c60048-6632-444d-bad5-6ed4a867c6a7"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.395452 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "0b688963-1f4c-40de-84df-35d1dbb57591" (UID: "0b688963-1f4c-40de-84df-35d1dbb57591"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.395872 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/73c60048-6632-444d-bad5-6ed4a867c6a7-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.401112 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8" (OuterVolumeSpecName: "kube-api-access-hjfw8") pod "0b688963-1f4c-40de-84df-35d1dbb57591" (UID: "0b688963-1f4c-40de-84df-35d1dbb57591"). InnerVolumeSpecName "kube-api-access-hjfw8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.411131 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd" (OuterVolumeSpecName: "kube-api-access-fnbsd") pod "73c60048-6632-444d-bad5-6ed4a867c6a7" (UID: "73c60048-6632-444d-bad5-6ed4a867c6a7"). InnerVolumeSpecName "kube-api-access-fnbsd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.497847 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/0b688963-1f4c-40de-84df-35d1dbb57591-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.497932 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hjfw8\" (UniqueName: \"kubernetes.io/projected/0b688963-1f4c-40de-84df-35d1dbb57591-kube-api-access-hjfw8\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:31 crc kubenswrapper[4812]: I0131 04:45:31.497948 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fnbsd\" (UniqueName: \"kubernetes.io/projected/73c60048-6632-444d-bad5-6ed4a867c6a7-kube-api-access-fnbsd\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.023539 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.023593 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-0480-account-create-update-kkssk" event={"ID":"73c60048-6632-444d-bad5-6ed4a867c6a7","Type":"ContainerDied","Data":"21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21"} Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.023642 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21b09b8b8266dd88b9018efeb4e509e393ddf1aac0e8f9cd9007a101dc57dd21" Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.026921 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-5mwsg" event={"ID":"0b688963-1f4c-40de-84df-35d1dbb57591","Type":"ContainerDied","Data":"684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3"} Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.026958 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-5mwsg" Jan 31 04:45:32 crc kubenswrapper[4812]: I0131 04:45:32.026958 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="684ffd861e1e2f200154f90a580b4dfe36d3ad296db34b52bfe7f88034f8bda3" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.744364 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-dj59v"] Jan 31 04:45:33 crc kubenswrapper[4812]: E0131 04:45:33.744973 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="73c60048-6632-444d-bad5-6ed4a867c6a7" containerName="mariadb-account-create-update" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.744992 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="73c60048-6632-444d-bad5-6ed4a867c6a7" containerName="mariadb-account-create-update" Jan 31 04:45:33 crc kubenswrapper[4812]: E0131 04:45:33.745014 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0b688963-1f4c-40de-84df-35d1dbb57591" containerName="mariadb-database-create" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.745022 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0b688963-1f4c-40de-84df-35d1dbb57591" containerName="mariadb-database-create" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.745171 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="73c60048-6632-444d-bad5-6ed4a867c6a7" containerName="mariadb-account-create-update" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.745200 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="0b688963-1f4c-40de-84df-35d1dbb57591" containerName="mariadb-database-create" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.745724 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.748161 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.752561 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-qp7k8" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.759858 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dj59v"] Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.845742 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.845831 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pmmn5\" (UniqueName: \"kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.846094 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.946886 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.947009 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.947044 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pmmn5\" (UniqueName: \"kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.958634 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.968392 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pmmn5\" (UniqueName: \"kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5\") pod \"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:33 crc kubenswrapper[4812]: I0131 04:45:33.969174 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data\") pod 
\"glance-db-sync-dj59v\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:34 crc kubenswrapper[4812]: I0131 04:45:34.110759 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:37 crc kubenswrapper[4812]: I0131 04:45:37.457563 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dj59v"] Jan 31 04:45:37 crc kubenswrapper[4812]: W0131 04:45:37.469315 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd5631486_9bad_47a5_9d04_ffd6bfea97e5.slice/crio-c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2 WatchSource:0}: Error finding container c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2: Status 404 returned error can't find the container with id c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2 Jan 31 04:45:38 crc kubenswrapper[4812]: I0131 04:45:38.143511 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dj59v" event={"ID":"d5631486-9bad-47a5-9d04-ffd6bfea97e5","Type":"ContainerStarted","Data":"c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2"} Jan 31 04:45:38 crc kubenswrapper[4812]: I0131 04:45:38.148389 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"75023eed-958f-468c-83b4-57345fbe3b87","Type":"ContainerStarted","Data":"64ddccba68b6ef44ce96819117d704b4bf34961061bfd14adac0c713e460dd93"} Jan 31 04:45:38 crc kubenswrapper[4812]: I0131 04:45:38.173753 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.403588969 podStartE2EDuration="10.173736003s" podCreationTimestamp="2026-01-31 04:45:28 +0000 UTC" firstStartedPulling="2026-01-31 04:45:29.382299508 +0000 UTC 
m=+1137.877321203" lastFinishedPulling="2026-01-31 04:45:37.152446572 +0000 UTC m=+1145.647468237" observedRunningTime="2026-01-31 04:45:38.169267543 +0000 UTC m=+1146.664289218" watchObservedRunningTime="2026-01-31 04:45:38.173736003 +0000 UTC m=+1146.668757668" Jan 31 04:45:44 crc kubenswrapper[4812]: I0131 04:45:44.338245 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:45:44 crc kubenswrapper[4812]: I0131 04:45:44.338794 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:45:49 crc kubenswrapper[4812]: I0131 04:45:49.228776 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dj59v" event={"ID":"d5631486-9bad-47a5-9d04-ffd6bfea97e5","Type":"ContainerStarted","Data":"92e99bf11cbc3d5dca231d0ee341676584a6e73f60e8fb652b9df931a5511eef"} Jan 31 04:45:49 crc kubenswrapper[4812]: I0131 04:45:49.254187 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-dj59v" podStartSLOduration=5.476743225 podStartE2EDuration="16.254162741s" podCreationTimestamp="2026-01-31 04:45:33 +0000 UTC" firstStartedPulling="2026-01-31 04:45:37.47280104 +0000 UTC m=+1145.967822735" lastFinishedPulling="2026-01-31 04:45:48.250220586 +0000 UTC m=+1156.745242251" observedRunningTime="2026-01-31 04:45:49.247316788 +0000 UTC m=+1157.742338463" watchObservedRunningTime="2026-01-31 04:45:49.254162741 +0000 UTC m=+1157.749184416" Jan 31 04:45:56 crc kubenswrapper[4812]: 
I0131 04:45:56.283543 4812 generic.go:334] "Generic (PLEG): container finished" podID="d5631486-9bad-47a5-9d04-ffd6bfea97e5" containerID="92e99bf11cbc3d5dca231d0ee341676584a6e73f60e8fb652b9df931a5511eef" exitCode=0 Jan 31 04:45:56 crc kubenswrapper[4812]: I0131 04:45:56.284140 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dj59v" event={"ID":"d5631486-9bad-47a5-9d04-ffd6bfea97e5","Type":"ContainerDied","Data":"92e99bf11cbc3d5dca231d0ee341676584a6e73f60e8fb652b9df931a5511eef"} Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.546920 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.734601 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pmmn5\" (UniqueName: \"kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5\") pod \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.734679 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data\") pod \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.734743 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data\") pod \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\" (UID: \"d5631486-9bad-47a5-9d04-ffd6bfea97e5\") " Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.740528 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "d5631486-9bad-47a5-9d04-ffd6bfea97e5" (UID: "d5631486-9bad-47a5-9d04-ffd6bfea97e5"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.740681 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5" (OuterVolumeSpecName: "kube-api-access-pmmn5") pod "d5631486-9bad-47a5-9d04-ffd6bfea97e5" (UID: "d5631486-9bad-47a5-9d04-ffd6bfea97e5"). InnerVolumeSpecName "kube-api-access-pmmn5". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.781534 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data" (OuterVolumeSpecName: "config-data") pod "d5631486-9bad-47a5-9d04-ffd6bfea97e5" (UID: "d5631486-9bad-47a5-9d04-ffd6bfea97e5"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.836547 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pmmn5\" (UniqueName: \"kubernetes.io/projected/d5631486-9bad-47a5-9d04-ffd6bfea97e5-kube-api-access-pmmn5\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.836581 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:57 crc kubenswrapper[4812]: I0131 04:45:57.836591 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/d5631486-9bad-47a5-9d04-ffd6bfea97e5-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:45:58 crc kubenswrapper[4812]: I0131 04:45:58.301365 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dj59v" event={"ID":"d5631486-9bad-47a5-9d04-ffd6bfea97e5","Type":"ContainerDied","Data":"c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2"} Jan 31 04:45:58 crc kubenswrapper[4812]: I0131 04:45:58.301409 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c1b869047860a873e3a40a67d1f614df86530353ba831747527454dfa00755d2" Jan 31 04:45:58 crc kubenswrapper[4812]: I0131 04:45:58.302082 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dj59v" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.723134 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:45:59 crc kubenswrapper[4812]: E0131 04:45:59.723802 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d5631486-9bad-47a5-9d04-ffd6bfea97e5" containerName="glance-db-sync" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.723820 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5631486-9bad-47a5-9d04-ffd6bfea97e5" containerName="glance-db-sync" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.724043 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d5631486-9bad-47a5-9d04-ffd6bfea97e5" containerName="glance-db-sync" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.755956 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.761200 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.765506 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-qp7k8" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.765911 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.770329 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.800113 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.811806 4812 util.go:30] "No sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.818237 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861227 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861278 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861308 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861329 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861360 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861397 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861420 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861442 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861497 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861517 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861542 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861566 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861596 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.861621 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962455 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962518 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962549 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962576 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962607 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962637 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod 
\"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962663 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962696 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962722 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962747 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962783 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: 
\"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962809 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962849 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962884 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962907 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962933 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " 
pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962959 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.962984 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963016 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963044 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963067 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-btpns\" (UniqueName: \"kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc 
kubenswrapper[4812]: I0131 04:45:59.963093 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963118 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963143 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963174 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963194 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963218 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.963247 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964299 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964314 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964361 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964364 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964421 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964423 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964608 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964647 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964939 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run\") pod \"glance-default-single-0\" (UID: 
\"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.964749 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.965088 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.977887 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.978246 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.988184 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 
crc kubenswrapper[4812]: I0131 04:45:59.991271 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:45:59 crc kubenswrapper[4812]: I0131 04:45:59.993001 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064180 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064255 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064291 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064315 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064328 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064355 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064386 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064415 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064442 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run\") pod \"glance-default-single-1\" (UID: 
\"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064473 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064499 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064524 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064696 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064781 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc 
kubenswrapper[4812]: I0131 04:46:00.064795 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064869 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064525 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064918 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.064966 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-btpns\" (UniqueName: \"kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.065024 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.065045 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.065129 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.065234 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.080874 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.095652 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.096148 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.099524 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.123695 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.123730 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-btpns\" (UniqueName: \"kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.130100 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.137117 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-1\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.398612 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.430470 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:00 crc kubenswrapper[4812]: I0131 04:46:00.892963 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:46:00 crc kubenswrapper[4812]: W0131 04:46:00.897510 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda8d89c30_da1f_40a4_95db_59e8c4d0fb1c.slice/crio-b6c6ad0e5c927379ccc8599e79858d73ec8a0dbf1013aa61628255dc7b8770a4 WatchSource:0}: Error finding container b6c6ad0e5c927379ccc8599e79858d73ec8a0dbf1013aa61628255dc7b8770a4: Status 404 returned error can't find the container with id b6c6ad0e5c927379ccc8599e79858d73ec8a0dbf1013aa61628255dc7b8770a4 Jan 31 04:46:01 crc kubenswrapper[4812]: I0131 04:46:01.334522 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" 
event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerStarted","Data":"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943"} Jan 31 04:46:01 crc kubenswrapper[4812]: I0131 04:46:01.335354 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerStarted","Data":"8bb448a3d137ec0dfcfc460a0555065058b80d5e417999b4dc06baf95df6b608"} Jan 31 04:46:01 crc kubenswrapper[4812]: I0131 04:46:01.338652 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerStarted","Data":"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b"} Jan 31 04:46:01 crc kubenswrapper[4812]: I0131 04:46:01.338694 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerStarted","Data":"b6c6ad0e5c927379ccc8599e79858d73ec8a0dbf1013aa61628255dc7b8770a4"} Jan 31 04:46:02 crc kubenswrapper[4812]: I0131 04:46:02.384121 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerStarted","Data":"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448"} Jan 31 04:46:02 crc kubenswrapper[4812]: I0131 04:46:02.386558 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerStarted","Data":"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8"} Jan 31 04:46:02 crc kubenswrapper[4812]: I0131 04:46:02.431468 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.431446987 podStartE2EDuration="3.431446987s" 
podCreationTimestamp="2026-01-31 04:45:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:46:02.425297962 +0000 UTC m=+1170.920319637" watchObservedRunningTime="2026-01-31 04:46:02.431446987 +0000 UTC m=+1170.926468672" Jan 31 04:46:02 crc kubenswrapper[4812]: I0131 04:46:02.452455 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=4.452433871 podStartE2EDuration="4.452433871s" podCreationTimestamp="2026-01-31 04:45:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:46:02.444775405 +0000 UTC m=+1170.939797100" watchObservedRunningTime="2026-01-31 04:46:02.452433871 +0000 UTC m=+1170.947455556" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.081753 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.082384 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.116524 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.162022 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.431282 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.431340 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.456396 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.460361 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.460413 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.460434 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:10 crc kubenswrapper[4812]: I0131 04:46:10.485414 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:11 crc kubenswrapper[4812]: I0131 04:46:11.466500 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.474332 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.474284 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.474585 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.664413 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.665341 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 
04:46:12.703459 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.729209 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:12 crc kubenswrapper[4812]: I0131 04:46:12.763157 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:14 crc kubenswrapper[4812]: I0131 04:46:14.343805 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:46:14 crc kubenswrapper[4812]: I0131 04:46:14.344140 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:46:14 crc kubenswrapper[4812]: I0131 04:46:14.494568 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-log" containerID="cri-o://fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943" gracePeriod=30 Jan 31 04:46:14 crc kubenswrapper[4812]: I0131 04:46:14.496508 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-httpd" containerID="cri-o://7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448" gracePeriod=30 Jan 31 04:46:14 crc 
kubenswrapper[4812]: I0131 04:46:14.506468 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-single-0" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.99:9292/healthcheck\": EOF" Jan 31 04:46:15 crc kubenswrapper[4812]: I0131 04:46:15.504263 4812 generic.go:334] "Generic (PLEG): container finished" podID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerID="fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943" exitCode=143 Jan 31 04:46:15 crc kubenswrapper[4812]: I0131 04:46:15.504349 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerDied","Data":"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943"} Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.371137 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432692 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432741 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432793 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts\") pod 
\"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432811 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432853 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432895 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432915 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432951 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.432977 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.433002 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.433022 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.433037 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.433069 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.433093 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\" (UID: \"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863\") " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.434021 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev" (OuterVolumeSpecName: "dev") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.434101 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.434101 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run" (OuterVolumeSpecName: "run") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.434140 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys" (OuterVolumeSpecName: "sys") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.434996 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.435578 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs" (OuterVolumeSpecName: "logs") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.439761 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.440078 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.440112 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.440139 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.441988 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg" (OuterVolumeSpecName: "kube-api-access-lmgdg") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "kube-api-access-lmgdg". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.442046 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts" (OuterVolumeSpecName: "scripts") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.443175 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "local-storage01-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.492774 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data" (OuterVolumeSpecName: "config-data") pod "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" (UID: "a90da8fc-9cf2-4007-b9a1-4f32a5a3b863"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.530903 4812 generic.go:334] "Generic (PLEG): container finished" podID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerID="7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448" exitCode=0 Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.530955 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerDied","Data":"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448"} Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.531014 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"a90da8fc-9cf2-4007-b9a1-4f32a5a3b863","Type":"ContainerDied","Data":"8bb448a3d137ec0dfcfc460a0555065058b80d5e417999b4dc06baf95df6b608"} Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.531021 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.531031 4812 scope.go:117] "RemoveContainer" containerID="7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534297 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534317 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534326 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534355 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534364 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lmgdg\" (UniqueName: \"kubernetes.io/projected/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-kube-api-access-lmgdg\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534380 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534388 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534397 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534404 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534412 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534420 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534428 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534435 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.534442 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.548736 4812 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.551980 4812 scope.go:117] "RemoveContainer" containerID="fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.568455 4812 scope.go:117] "RemoveContainer" containerID="7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448" Jan 31 04:46:18 crc kubenswrapper[4812]: E0131 04:46:18.569017 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448\": container with ID starting with 7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448 not found: ID does not exist" containerID="7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.569105 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448"} err="failed to get container status \"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448\": rpc error: code = NotFound desc = could not find container \"7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448\": container with ID starting with 7b20aa254e2370c325ae3bf15b59fb4f93e7aecfe6a5f9a5f2059b5aa8a64448 not found: ID does not exist" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.569182 4812 scope.go:117] "RemoveContainer" containerID="fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943" Jan 31 04:46:18 crc kubenswrapper[4812]: E0131 04:46:18.569489 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943\": container 
with ID starting with fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943 not found: ID does not exist" containerID="fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.569511 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943"} err="failed to get container status \"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943\": rpc error: code = NotFound desc = could not find container \"fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943\": container with ID starting with fe9cfcd04b9afad93a684dc8e3612f3fccac78c757c2c4feff573770a4355943 not found: ID does not exist" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.571015 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.571087 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.577462 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.592469 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:18 crc kubenswrapper[4812]: E0131 04:46:18.592777 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-httpd" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.592798 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-httpd" Jan 31 04:46:18 crc kubenswrapper[4812]: E0131 04:46:18.592817 4812 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-log" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.592827 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-log" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.593005 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-log" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.593026 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" containerName="glance-httpd" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.594117 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.606729 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.640290 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.640687 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.640752 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.640823 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.640853 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641007 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641036 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641090 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641123 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641156 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641191 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641232 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641259 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxg2m\" (UniqueName: 
\"kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641319 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641341 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.641552 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.661598 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.661604 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-single-0\" 
(UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.742850 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.742901 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.742934 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.742956 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.742976 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxg2m\" (UniqueName: \"kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 
04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743008 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743029 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743023 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743081 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743110 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743049 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743093 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743230 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743352 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743370 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743414 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts\") pod \"glance-default-single-0\" (UID: 
\"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743490 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743540 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743572 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743605 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.743801 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.748280 
4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.749791 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.763920 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jxg2m\" (UniqueName: \"kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m\") pod \"glance-default-single-0\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:18 crc kubenswrapper[4812]: I0131 04:46:18.908295 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:19 crc kubenswrapper[4812]: I0131 04:46:19.136265 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:19 crc kubenswrapper[4812]: I0131 04:46:19.538576 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerStarted","Data":"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7"} Jan 31 04:46:19 crc kubenswrapper[4812]: I0131 04:46:19.538828 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerStarted","Data":"e4588f8728e9a751f5213632e0aa1c6fd3bcf0609ad9341de31c3e2f42573a4a"} Jan 31 04:46:20 crc kubenswrapper[4812]: I0131 04:46:20.363424 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a90da8fc-9cf2-4007-b9a1-4f32a5a3b863" path="/var/lib/kubelet/pods/a90da8fc-9cf2-4007-b9a1-4f32a5a3b863/volumes" Jan 31 04:46:20 crc kubenswrapper[4812]: I0131 04:46:20.547929 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerStarted","Data":"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a"} Jan 31 04:46:20 crc kubenswrapper[4812]: I0131 04:46:20.581060 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=2.581031735 podStartE2EDuration="2.581031735s" podCreationTimestamp="2026-01-31 04:46:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:46:20.574442099 +0000 UTC m=+1189.069463784" watchObservedRunningTime="2026-01-31 04:46:20.581031735 +0000 
UTC m=+1189.076053420" Jan 31 04:46:28 crc kubenswrapper[4812]: I0131 04:46:28.908814 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:28 crc kubenswrapper[4812]: I0131 04:46:28.909472 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:28 crc kubenswrapper[4812]: I0131 04:46:28.950543 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:28 crc kubenswrapper[4812]: I0131 04:46:28.963088 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:29 crc kubenswrapper[4812]: I0131 04:46:29.628898 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:29 crc kubenswrapper[4812]: I0131 04:46:29.628958 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:31 crc kubenswrapper[4812]: I0131 04:46:31.643177 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:31 crc kubenswrapper[4812]: I0131 04:46:31.647114 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:46:31 crc kubenswrapper[4812]: I0131 04:46:31.667021 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.202824 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dj59v"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.213908 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dj59v"] Jan 31 04:46:44 crc 
kubenswrapper[4812]: I0131 04:46:44.262672 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.262936 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-log" containerID="cri-o://21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7" gracePeriod=30 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.263002 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-httpd" containerID="cri-o://7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a" gracePeriod=30 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.268153 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.268444 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-log" containerID="cri-o://ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b" gracePeriod=30 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.268571 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-httpd" containerID="cri-o://3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8" gracePeriod=30 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.291350 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance0480-account-delete-66q8q"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 
04:46:44.292754 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.310215 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance0480-account-delete-66q8q"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.338488 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.338534 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.338572 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.339021 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.339081 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" 
containerName="machine-config-daemon" containerID="cri-o://3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2" gracePeriod=600 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.350293 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5631486-9bad-47a5-9d04-ffd6bfea97e5" path="/var/lib/kubelet/pods/d5631486-9bad-47a5-9d04-ffd6bfea97e5/volumes" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.405448 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.407979 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstackclient" podUID="75023eed-958f-468c-83b4-57345fbe3b87" containerName="openstackclient" containerID="cri-o://64ddccba68b6ef44ce96819117d704b4bf34961061bfd14adac0c713e460dd93" gracePeriod=30 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.439227 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d228m\" (UniqueName: \"kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.439348 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.540808 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.540947 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d228m\" (UniqueName: \"kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.542073 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.560662 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d228m\" (UniqueName: \"kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m\") pod \"glance0480-account-delete-66q8q\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.622771 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.789502 4812 generic.go:334] "Generic (PLEG): container finished" podID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerID="21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7" exitCode=143 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.789552 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerDied","Data":"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7"} Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.795300 4812 generic.go:334] "Generic (PLEG): container finished" podID="75023eed-958f-468c-83b4-57345fbe3b87" containerID="64ddccba68b6ef44ce96819117d704b4bf34961061bfd14adac0c713e460dd93" exitCode=143 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.795408 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"75023eed-958f-468c-83b4-57345fbe3b87","Type":"ContainerDied","Data":"64ddccba68b6ef44ce96819117d704b4bf34961061bfd14adac0c713e460dd93"} Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.802077 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2" exitCode=0 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.802134 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2"} Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.802159 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" 
event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718"} Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.802175 4812 scope.go:117] "RemoveContainer" containerID="1cec58e47f0a7d8b5dd4b0d6f39c2bf781f061b2945554dc684cf3bd87d589d3" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.816068 4812 generic.go:334] "Generic (PLEG): container finished" podID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerID="ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b" exitCode=143 Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.816116 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerDied","Data":"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b"} Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.824824 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.948596 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret\") pod \"75023eed-958f-468c-83b4-57345fbe3b87\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.949069 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts\") pod \"75023eed-958f-468c-83b4-57345fbe3b87\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.949112 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config\") pod \"75023eed-958f-468c-83b4-57345fbe3b87\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.949485 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcskt\" (UniqueName: \"kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt\") pod \"75023eed-958f-468c-83b4-57345fbe3b87\" (UID: \"75023eed-958f-468c-83b4-57345fbe3b87\") " Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.950606 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts" (OuterVolumeSpecName: "openstack-scripts") pod "75023eed-958f-468c-83b4-57345fbe3b87" (UID: "75023eed-958f-468c-83b4-57345fbe3b87"). InnerVolumeSpecName "openstack-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.955039 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt" (OuterVolumeSpecName: "kube-api-access-rcskt") pod "75023eed-958f-468c-83b4-57345fbe3b87" (UID: "75023eed-958f-468c-83b4-57345fbe3b87"). InnerVolumeSpecName "kube-api-access-rcskt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.966513 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config" (OuterVolumeSpecName: "openstack-config") pod "75023eed-958f-468c-83b4-57345fbe3b87" (UID: "75023eed-958f-468c-83b4-57345fbe3b87"). InnerVolumeSpecName "openstack-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:46:44 crc kubenswrapper[4812]: I0131 04:46:44.968206 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret" (OuterVolumeSpecName: "openstack-config-secret") pod "75023eed-958f-468c-83b4-57345fbe3b87" (UID: "75023eed-958f-468c-83b4-57345fbe3b87"). InnerVolumeSpecName "openstack-config-secret". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.051487 4812 reconciler_common.go:293] "Volume detached for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.051515 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rcskt\" (UniqueName: \"kubernetes.io/projected/75023eed-958f-468c-83b4-57345fbe3b87-kube-api-access-rcskt\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.051524 4812 reconciler_common.go:293] "Volume detached for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/75023eed-958f-468c-83b4-57345fbe3b87-openstack-config-secret\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.051531 4812 reconciler_common.go:293] "Volume detached for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/75023eed-958f-468c-83b4-57345fbe3b87-openstack-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.109751 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance0480-account-delete-66q8q"] Jan 31 04:46:45 crc kubenswrapper[4812]: W0131 04:46:45.115122 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493247e7_7672_44e0_87df_aec6290a3f55.slice/crio-d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0 WatchSource:0}: Error finding container d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0: Status 404 returned error can't find the container with id d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0 Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.827372 4812 generic.go:334] "Generic (PLEG): container 
finished" podID="493247e7-7672-44e0-87df-aec6290a3f55" containerID="1912eb3f1bd65883a6d6c9741bcf7a2a5180f155c29d261cb16b93591689948e" exitCode=0 Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.827434 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" event={"ID":"493247e7-7672-44e0-87df-aec6290a3f55","Type":"ContainerDied","Data":"1912eb3f1bd65883a6d6c9741bcf7a2a5180f155c29d261cb16b93591689948e"} Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.827456 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" event={"ID":"493247e7-7672-44e0-87df-aec6290a3f55","Type":"ContainerStarted","Data":"d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0"} Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.829039 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"75023eed-958f-468c-83b4-57345fbe3b87","Type":"ContainerDied","Data":"c51e33e30f0c9f40a67a8cd5853c683fe0d771cdc0d57c2a3c343551de460173"} Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.829065 4812 scope.go:117] "RemoveContainer" containerID="64ddccba68b6ef44ce96819117d704b4bf34961061bfd14adac0c713e460dd93" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.829132 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstackclient" Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.884117 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:46:45 crc kubenswrapper[4812]: I0131 04:46:45.894219 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstackclient"] Jan 31 04:46:46 crc kubenswrapper[4812]: I0131 04:46:46.352289 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75023eed-958f-468c-83b4-57345fbe3b87" path="/var/lib/kubelet/pods/75023eed-958f-468c-83b4-57345fbe3b87/volumes" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.234585 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.393734 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d228m\" (UniqueName: \"kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m\") pod \"493247e7-7672-44e0-87df-aec6290a3f55\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.393978 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts\") pod \"493247e7-7672-44e0-87df-aec6290a3f55\" (UID: \"493247e7-7672-44e0-87df-aec6290a3f55\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.394584 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "493247e7-7672-44e0-87df-aec6290a3f55" (UID: "493247e7-7672-44e0-87df-aec6290a3f55"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.394723 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/493247e7-7672-44e0-87df-aec6290a3f55-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.404116 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m" (OuterVolumeSpecName: "kube-api-access-d228m") pod "493247e7-7672-44e0-87df-aec6290a3f55" (UID: "493247e7-7672-44e0-87df-aec6290a3f55"). InnerVolumeSpecName "kube-api-access-d228m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.496082 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d228m\" (UniqueName: \"kubernetes.io/projected/493247e7-7672-44e0-87df-aec6290a3f55-kube-api-access-d228m\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.736281 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.788040 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.850982 4812 generic.go:334] "Generic (PLEG): container finished" podID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerID="3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8" exitCode=0 Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.851053 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerDied","Data":"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8"} Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.851083 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c","Type":"ContainerDied","Data":"b6c6ad0e5c927379ccc8599e79858d73ec8a0dbf1013aa61628255dc7b8770a4"} Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.851104 4812 scope.go:117] "RemoveContainer" containerID="3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.851212 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.855239 4812 generic.go:334] "Generic (PLEG): container finished" podID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerID="7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a" exitCode=0 Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.855308 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.855304 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerDied","Data":"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a"} Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.855354 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108","Type":"ContainerDied","Data":"e4588f8728e9a751f5213632e0aa1c6fd3bcf0609ad9341de31c3e2f42573a4a"} Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.857704 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" event={"ID":"493247e7-7672-44e0-87df-aec6290a3f55","Type":"ContainerDied","Data":"d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0"} Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.857727 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="d357dbf85d917a6a74d805d4a0c9eebe356334c77cd5d646a238bd07a9db47c0" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.857749 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance0480-account-delete-66q8q" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.875966 4812 scope.go:117] "RemoveContainer" containerID="ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.895336 4812 scope.go:117] "RemoveContainer" containerID="3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8" Jan 31 04:46:47 crc kubenswrapper[4812]: E0131 04:46:47.896223 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8\": container with ID starting with 3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8 not found: ID does not exist" containerID="3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.896275 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8"} err="failed to get container status \"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8\": rpc error: code = NotFound desc = could not find container \"3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8\": container with ID starting with 3bb5cfac8ddb24f49f96bf075c1d9a6997e4555b9ef24fb614ac97790ded17f8 not found: ID does not exist" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.896311 4812 scope.go:117] "RemoveContainer" containerID="ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b" Jan 31 04:46:47 crc kubenswrapper[4812]: E0131 04:46:47.896819 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b\": container with ID starting with 
ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b not found: ID does not exist" containerID="ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.896872 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b"} err="failed to get container status \"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b\": rpc error: code = NotFound desc = could not find container \"ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b\": container with ID starting with ba913c8e732965b077533cb2d235711f7dd0eb06180d037e51b5bf995b83440b not found: ID does not exist" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.896898 4812 scope.go:117] "RemoveContainer" containerID="7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903448 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903519 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903650 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 
04:46:47.903662 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903685 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903737 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903757 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903880 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903912 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-btpns\" (UniqueName: \"kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns\") pod 
\"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903962 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.903991 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904038 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904065 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904121 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904145 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904218 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904249 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904257 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys" (OuterVolumeSpecName: "sys") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904306 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904331 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run" (OuterVolumeSpecName: "run") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904335 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904409 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904418 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904454 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904477 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904488 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run" (OuterVolumeSpecName: "run") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904516 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jxg2m\" (UniqueName: \"kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904541 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904564 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904613 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts\") pod \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\" (UID: \"a8d89c30-da1f-40a4-95db-59e8c4d0fb1c\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904658 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev" (OuterVolumeSpecName: "dev") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904665 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904692 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904702 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904738 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run\") pod \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\" (UID: \"9aeb8af5-8d49-4f37-aa4f-541c8ef4c108\") " Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904924 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904968 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.904991 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev" (OuterVolumeSpecName: "dev") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905445 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905484 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905503 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905520 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905537 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905555 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905572 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905588 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905905 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys" (OuterVolumeSpecName: "sys") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905915 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.905948 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906796 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906299 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906642 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs" (OuterVolumeSpecName: "logs") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906562 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs" (OuterVolumeSpecName: "logs") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906687 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.906991 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.907011 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.908308 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts" (OuterVolumeSpecName: "scripts") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.908721 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.908969 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). 
InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.909524 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.909660 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.909811 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m" (OuterVolumeSpecName: "kube-api-access-jxg2m") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "kube-api-access-jxg2m". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.910095 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts" (OuterVolumeSpecName: "scripts") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.911388 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns" (OuterVolumeSpecName: "kube-api-access-btpns") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "kube-api-access-btpns". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.922058 4812 scope.go:117] "RemoveContainer" containerID="21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.936159 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data" (OuterVolumeSpecName: "config-data") pod "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" (UID: "a8d89c30-da1f-40a4-95db-59e8c4d0fb1c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.940983 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data" (OuterVolumeSpecName: "config-data") pod "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" (UID: "9aeb8af5-8d49-4f37-aa4f-541c8ef4c108"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.941650 4812 scope.go:117] "RemoveContainer" containerID="7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a" Jan 31 04:46:47 crc kubenswrapper[4812]: E0131 04:46:47.942061 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a\": container with ID starting with 7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a not found: ID does not exist" containerID="7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.942122 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a"} err="failed to get container status \"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a\": rpc error: code = NotFound desc = could not find container \"7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a\": container with ID starting with 7509e7c472f0101e4465cd7fb40cf6c1398bc301107843ca91015f94d9460c8a not found: ID does not exist" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.942148 4812 scope.go:117] "RemoveContainer" containerID="21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7" Jan 31 04:46:47 crc kubenswrapper[4812]: E0131 04:46:47.942476 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7\": container with ID starting with 21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7 not found: ID does not exist" containerID="21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7" Jan 31 04:46:47 crc kubenswrapper[4812]: I0131 04:46:47.942500 
4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7"} err="failed to get container status \"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7\": rpc error: code = NotFound desc = could not find container \"21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7\": container with ID starting with 21083f67289a4c00e0c9a34220337ee736d4bfd69456cafb02acbed315d4f9b7 not found: ID does not exist" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008250 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008306 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008326 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008344 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008364 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxg2m\" (UniqueName: \"kubernetes.io/projected/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-kube-api-access-jxg2m\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008421 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008445 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008464 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008482 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008502 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008521 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008560 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008580 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008606 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008629 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008648 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-btpns\" (UniqueName: \"kubernetes.io/projected/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-kube-api-access-btpns\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.008667 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.022386 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.028827 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.034250 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.038489 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.109939 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.109977 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.109988 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.109995 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.242866 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.252391 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.260934 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.268698 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.355609 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" path="/var/lib/kubelet/pods/9aeb8af5-8d49-4f37-aa4f-541c8ef4c108/volumes" Jan 31 04:46:48 crc kubenswrapper[4812]: I0131 04:46:48.357445 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" path="/var/lib/kubelet/pods/a8d89c30-da1f-40a4-95db-59e8c4d0fb1c/volumes" Jan 31 
04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.311335 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-5mwsg"] Jan 31 04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.318029 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-5mwsg"] Jan 31 04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.328260 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance0480-account-delete-66q8q"] Jan 31 04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.337253 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-0480-account-create-update-kkssk"] Jan 31 04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.344977 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance0480-account-delete-66q8q"] Jan 31 04:46:49 crc kubenswrapper[4812]: I0131 04:46:49.351668 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-0480-account-create-update-kkssk"] Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.349598 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0b688963-1f4c-40de-84df-35d1dbb57591" path="/var/lib/kubelet/pods/0b688963-1f4c-40de-84df-35d1dbb57591/volumes" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.350397 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="493247e7-7672-44e0-87df-aec6290a3f55" path="/var/lib/kubelet/pods/493247e7-7672-44e0-87df-aec6290a3f55/volumes" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.350872 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="73c60048-6632-444d-bad5-6ed4a867c6a7" path="/var/lib/kubelet/pods/73c60048-6632-444d-bad5-6ed4a867c6a7/volumes" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.493387 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2"] Jan 31 04:46:50 crc 
kubenswrapper[4812]: E0131 04:46:50.494033 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494071 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: E0131 04:46:50.494089 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494105 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: E0131 04:46:50.494144 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="75023eed-958f-468c-83b4-57345fbe3b87" containerName="openstackclient" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494161 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="75023eed-958f-468c-83b4-57345fbe3b87" containerName="openstackclient" Jan 31 04:46:50 crc kubenswrapper[4812]: E0131 04:46:50.494189 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494205 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: E0131 04:46:50.494243 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="493247e7-7672-44e0-87df-aec6290a3f55" containerName="mariadb-account-delete" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494259 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="493247e7-7672-44e0-87df-aec6290a3f55" containerName="mariadb-account-delete" Jan 31 04:46:50 crc kubenswrapper[4812]: E0131 
04:46:50.494288 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494305 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494629 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494664 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494703 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a8d89c30-da1f-40a4-95db-59e8c4d0fb1c" containerName="glance-httpd" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494728 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="493247e7-7672-44e0-87df-aec6290a3f55" containerName="mariadb-account-delete" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494755 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9aeb8af5-8d49-4f37-aa4f-541c8ef4c108" containerName="glance-log" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.494775 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="75023eed-958f-468c-83b4-57345fbe3b87" containerName="openstackclient" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.495793 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.499199 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-bzpzc"] Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.501061 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.501302 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.506551 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2"] Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.513167 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-bzpzc"] Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.647331 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lchd7\" (UniqueName: \"kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.647534 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.647601 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-p2pxs\" (UniqueName: \"kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.647794 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.749595 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lchd7\" (UniqueName: \"kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.749745 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.749789 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-p2pxs\" (UniqueName: \"kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: 
I0131 04:46:50.749883 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.751317 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.751408 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.774944 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lchd7\" (UniqueName: \"kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7\") pod \"glance-db-create-bzpzc\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.777486 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-p2pxs\" (UniqueName: \"kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs\") pod \"glance-9ae3-account-create-update-f8wh2\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc 
kubenswrapper[4812]: I0131 04:46:50.822433 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:50 crc kubenswrapper[4812]: I0131 04:46:50.836017 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.052038 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2"] Jan 31 04:46:51 crc kubenswrapper[4812]: W0131 04:46:51.064018 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod7c990aff_f3f1_480e_a40d_2426778a990b.slice/crio-13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace WatchSource:0}: Error finding container 13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace: Status 404 returned error can't find the container with id 13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.325719 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-bzpzc"] Jan 31 04:46:51 crc kubenswrapper[4812]: W0131 04:46:51.331431 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9e7e7548_7925_49f0_99ce_000ee043788a.slice/crio-82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d WatchSource:0}: Error finding container 82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d: Status 404 returned error can't find the container with id 82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.898780 4812 generic.go:334] "Generic (PLEG): container finished" podID="7c990aff-f3f1-480e-a40d-2426778a990b" 
containerID="6d8832b0bc6414e5e64d767b2c3eff38c263d5767886a75c27a33f8f6913b33d" exitCode=0 Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.898946 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" event={"ID":"7c990aff-f3f1-480e-a40d-2426778a990b","Type":"ContainerDied","Data":"6d8832b0bc6414e5e64d767b2c3eff38c263d5767886a75c27a33f8f6913b33d"} Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.898992 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" event={"ID":"7c990aff-f3f1-480e-a40d-2426778a990b","Type":"ContainerStarted","Data":"13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace"} Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.900697 4812 generic.go:334] "Generic (PLEG): container finished" podID="9e7e7548-7925-49f0-99ce-000ee043788a" containerID="877083d9b518409a463edb666d633e4c7ccd9f471cd9f50026d9061a729273c0" exitCode=0 Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.900752 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bzpzc" event={"ID":"9e7e7548-7925-49f0-99ce-000ee043788a","Type":"ContainerDied","Data":"877083d9b518409a463edb666d633e4c7ccd9f471cd9f50026d9061a729273c0"} Jan 31 04:46:51 crc kubenswrapper[4812]: I0131 04:46:51.900783 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bzpzc" event={"ID":"9e7e7548-7925-49f0-99ce-000ee043788a","Type":"ContainerStarted","Data":"82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d"} Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.251651 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.259326 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.404251 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p2pxs\" (UniqueName: \"kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs\") pod \"7c990aff-f3f1-480e-a40d-2426778a990b\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.404307 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lchd7\" (UniqueName: \"kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7\") pod \"9e7e7548-7925-49f0-99ce-000ee043788a\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.404331 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts\") pod \"7c990aff-f3f1-480e-a40d-2426778a990b\" (UID: \"7c990aff-f3f1-480e-a40d-2426778a990b\") " Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.404412 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts\") pod \"9e7e7548-7925-49f0-99ce-000ee043788a\" (UID: \"9e7e7548-7925-49f0-99ce-000ee043788a\") " Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.405441 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9e7e7548-7925-49f0-99ce-000ee043788a" (UID: "9e7e7548-7925-49f0-99ce-000ee043788a"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.405511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7c990aff-f3f1-480e-a40d-2426778a990b" (UID: "7c990aff-f3f1-480e-a40d-2426778a990b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.410568 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs" (OuterVolumeSpecName: "kube-api-access-p2pxs") pod "7c990aff-f3f1-480e-a40d-2426778a990b" (UID: "7c990aff-f3f1-480e-a40d-2426778a990b"). InnerVolumeSpecName "kube-api-access-p2pxs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.414016 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7" (OuterVolumeSpecName: "kube-api-access-lchd7") pod "9e7e7548-7925-49f0-99ce-000ee043788a" (UID: "9e7e7548-7925-49f0-99ce-000ee043788a"). InnerVolumeSpecName "kube-api-access-lchd7". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.505996 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-p2pxs\" (UniqueName: \"kubernetes.io/projected/7c990aff-f3f1-480e-a40d-2426778a990b-kube-api-access-p2pxs\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.506034 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lchd7\" (UniqueName: \"kubernetes.io/projected/9e7e7548-7925-49f0-99ce-000ee043788a-kube-api-access-lchd7\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.506046 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7c990aff-f3f1-480e-a40d-2426778a990b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.506058 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9e7e7548-7925-49f0-99ce-000ee043788a-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.922774 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" event={"ID":"7c990aff-f3f1-480e-a40d-2426778a990b","Type":"ContainerDied","Data":"13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace"} Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.922820 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.922887 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="13b42f906e30f5759e55b33789e605572a810da4d8ed20acd2d348d898d34ace" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.924715 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-bzpzc" event={"ID":"9e7e7548-7925-49f0-99ce-000ee043788a","Type":"ContainerDied","Data":"82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d"} Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.924767 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="82b3f26321ad5e52195235d35929f648717e04bdd47bb0941e634890dfd8718d" Jan 31 04:46:53 crc kubenswrapper[4812]: I0131 04:46:53.924773 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-bzpzc" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.637963 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-nfwkt"] Jan 31 04:46:55 crc kubenswrapper[4812]: E0131 04:46:55.639266 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7c990aff-f3f1-480e-a40d-2426778a990b" containerName="mariadb-account-create-update" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.639334 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7c990aff-f3f1-480e-a40d-2426778a990b" containerName="mariadb-account-create-update" Jan 31 04:46:55 crc kubenswrapper[4812]: E0131 04:46:55.639407 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9e7e7548-7925-49f0-99ce-000ee043788a" containerName="mariadb-database-create" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.639464 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9e7e7548-7925-49f0-99ce-000ee043788a" 
containerName="mariadb-database-create" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.639651 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9e7e7548-7925-49f0-99ce-000ee043788a" containerName="mariadb-database-create" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.639726 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="7c990aff-f3f1-480e-a40d-2426778a990b" containerName="mariadb-account-create-update" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.642451 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.644816 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-k4922" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.645115 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.645171 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.655917 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-nfwkt"] Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.746898 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.746984 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: 
\"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.747036 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lm748\" (UniqueName: \"kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.747059 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.848997 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.849136 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lm748\" (UniqueName: \"kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.849183 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: 
\"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.849312 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.856175 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.857280 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.857707 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data\") pod \"glance-db-sync-nfwkt\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.894726 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lm748\" (UniqueName: \"kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748\") pod \"glance-db-sync-nfwkt\" 
(UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:55 crc kubenswrapper[4812]: I0131 04:46:55.962067 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:46:56 crc kubenswrapper[4812]: I0131 04:46:56.185594 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-nfwkt"] Jan 31 04:46:56 crc kubenswrapper[4812]: W0131 04:46:56.190556 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1d44fec3_78e6_40fd_9d28_c5843f96e777.slice/crio-b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853 WatchSource:0}: Error finding container b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853: Status 404 returned error can't find the container with id b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853 Jan 31 04:46:56 crc kubenswrapper[4812]: I0131 04:46:56.948585 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-nfwkt" event={"ID":"1d44fec3-78e6-40fd-9d28-c5843f96e777","Type":"ContainerStarted","Data":"be533b808bb3e368e8424c4813d13d7c98276f784fdfd2562862634ebbb30094"} Jan 31 04:46:56 crc kubenswrapper[4812]: I0131 04:46:56.948956 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-nfwkt" event={"ID":"1d44fec3-78e6-40fd-9d28-c5843f96e777","Type":"ContainerStarted","Data":"b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853"} Jan 31 04:46:56 crc kubenswrapper[4812]: I0131 04:46:56.966307 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-nfwkt" podStartSLOduration=1.966291183 podStartE2EDuration="1.966291183s" podCreationTimestamp="2026-01-31 04:46:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:46:56.964055654 +0000 UTC m=+1225.459077339" watchObservedRunningTime="2026-01-31 04:46:56.966291183 +0000 UTC m=+1225.461312838" Jan 31 04:46:59 crc kubenswrapper[4812]: I0131 04:46:59.970556 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d44fec3-78e6-40fd-9d28-c5843f96e777" containerID="be533b808bb3e368e8424c4813d13d7c98276f784fdfd2562862634ebbb30094" exitCode=0 Jan 31 04:46:59 crc kubenswrapper[4812]: I0131 04:46:59.970615 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-nfwkt" event={"ID":"1d44fec3-78e6-40fd-9d28-c5843f96e777","Type":"ContainerDied","Data":"be533b808bb3e368e8424c4813d13d7c98276f784fdfd2562862634ebbb30094"} Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.378017 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.528822 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data\") pod \"1d44fec3-78e6-40fd-9d28-c5843f96e777\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.529591 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data\") pod \"1d44fec3-78e6-40fd-9d28-c5843f96e777\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.538949 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle\") pod \"1d44fec3-78e6-40fd-9d28-c5843f96e777\" (UID: 
\"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.539261 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lm748\" (UniqueName: \"kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748\") pod \"1d44fec3-78e6-40fd-9d28-c5843f96e777\" (UID: \"1d44fec3-78e6-40fd-9d28-c5843f96e777\") " Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.539150 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1d44fec3-78e6-40fd-9d28-c5843f96e777" (UID: "1d44fec3-78e6-40fd-9d28-c5843f96e777"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.540341 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.543230 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748" (OuterVolumeSpecName: "kube-api-access-lm748") pod "1d44fec3-78e6-40fd-9d28-c5843f96e777" (UID: "1d44fec3-78e6-40fd-9d28-c5843f96e777"). InnerVolumeSpecName "kube-api-access-lm748". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.568138 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "1d44fec3-78e6-40fd-9d28-c5843f96e777" (UID: "1d44fec3-78e6-40fd-9d28-c5843f96e777"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.604417 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data" (OuterVolumeSpecName: "config-data") pod "1d44fec3-78e6-40fd-9d28-c5843f96e777" (UID: "1d44fec3-78e6-40fd-9d28-c5843f96e777"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.642465 4812 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.642505 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lm748\" (UniqueName: \"kubernetes.io/projected/1d44fec3-78e6-40fd-9d28-c5843f96e777-kube-api-access-lm748\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.642520 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d44fec3-78e6-40fd-9d28-c5843f96e777-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.991722 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-nfwkt" event={"ID":"1d44fec3-78e6-40fd-9d28-c5843f96e777","Type":"ContainerDied","Data":"b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853"} Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.992055 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b1f1811dca1edfedd0bf1d4369c333b1bdd6d93a9fdf803fe7cd0482405d0853" Jan 31 04:47:01 crc kubenswrapper[4812]: I0131 04:47:01.991813 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-nfwkt" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.414787 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:02 crc kubenswrapper[4812]: E0131 04:47:02.415232 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d44fec3-78e6-40fd-9d28-c5843f96e777" containerName="glance-db-sync" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.415252 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d44fec3-78e6-40fd-9d28-c5843f96e777" containerName="glance-db-sync" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.415508 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d44fec3-78e6-40fd-9d28-c5843f96e777" containerName="glance-db-sync" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.416584 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.418715 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.418827 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-k4922" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.419136 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.419264 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.419439 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.419749 4812 
reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.431415 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559797 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559854 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559881 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559906 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rzjw5\" (UniqueName: \"kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559931 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.559972 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.560001 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.560117 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.560283 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662076 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662137 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662183 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662227 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rzjw5\" (UniqueName: \"kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662274 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662323 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" 
(UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662406 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662462 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662548 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.662896 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.663021 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: 
\"025a220f-b254-4b50-8036-7aebb692d60c\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.663036 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.667520 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.667749 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.668361 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.668436 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" 
Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.671620 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.692044 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rzjw5\" (UniqueName: \"kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.701502 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:02 crc kubenswrapper[4812]: I0131 04:47:02.734815 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:03 crc kubenswrapper[4812]: I0131 04:47:03.028390 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:03 crc kubenswrapper[4812]: I0131 04:47:03.402818 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:04 crc kubenswrapper[4812]: I0131 04:47:04.015326 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerStarted","Data":"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"} Jan 31 04:47:04 crc kubenswrapper[4812]: I0131 04:47:04.015670 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerStarted","Data":"56eab03fe24eba6a2738f18bb2799b1ef15c1770ebebae073da09c5902b5a03a"} Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.026150 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerStarted","Data":"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"} Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.026552 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-httpd" containerID="cri-o://54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee" gracePeriod=30 Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.027035 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-log" 
containerID="cri-o://017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f" gracePeriod=30 Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.052517 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.052496081 podStartE2EDuration="3.052496081s" podCreationTimestamp="2026-01-31 04:47:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:05.048192144 +0000 UTC m=+1233.543213809" watchObservedRunningTime="2026-01-31 04:47:05.052496081 +0000 UTC m=+1233.547517756" Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.586920 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708103 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rzjw5\" (UniqueName: \"kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708160 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708210 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") " Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708279 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708345 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708433 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708488 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708571 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708624 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data\") pod \"025a220f-b254-4b50-8036-7aebb692d60c\" (UID: \"025a220f-b254-4b50-8036-7aebb692d60c\") "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.708790 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs" (OuterVolumeSpecName: "logs") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.709203 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-logs\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.709201 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.714546 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts" (OuterVolumeSpecName: "scripts") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.726632 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5" (OuterVolumeSpecName: "kube-api-access-rzjw5") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "kube-api-access-rzjw5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.730353 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.740623 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "combined-ca-bundle". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.748411 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.748484 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "internal-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.765232 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data" (OuterVolumeSpecName: "config-data") pod "025a220f-b254-4b50-8036-7aebb692d60c" (UID: "025a220f-b254-4b50-8036-7aebb692d60c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue ""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810423 4812 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-internal-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810465 4812 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-combined-ca-bundle\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810504 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" "
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810518 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-config-data\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810531 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rzjw5\" (UniqueName: \"kubernetes.io/projected/025a220f-b254-4b50-8036-7aebb692d60c-kube-api-access-rzjw5\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810545 4812 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-public-tls-certs\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810557 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/025a220f-b254-4b50-8036-7aebb692d60c-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.810568 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/025a220f-b254-4b50-8036-7aebb692d60c-httpd-run\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.827408 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc"
Jan 31 04:47:05 crc kubenswrapper[4812]: I0131 04:47:05.912079 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\""
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041639 4812 generic.go:334] "Generic (PLEG): container finished" podID="025a220f-b254-4b50-8036-7aebb692d60c" containerID="54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee" exitCode=0
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041684 4812 generic.go:334] "Generic (PLEG): container finished" podID="025a220f-b254-4b50-8036-7aebb692d60c" containerID="017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f" exitCode=143
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041718 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerDied","Data":"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"}
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041729 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041759 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerDied","Data":"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"}
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041781 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"025a220f-b254-4b50-8036-7aebb692d60c","Type":"ContainerDied","Data":"56eab03fe24eba6a2738f18bb2799b1ef15c1770ebebae073da09c5902b5a03a"}
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.041809 4812 scope.go:117] "RemoveContainer" containerID="54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.074678 4812 scope.go:117] "RemoveContainer" containerID="017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.102583 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.110594 4812 scope.go:117] "RemoveContainer" containerID="54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"
Jan 31 04:47:06 crc kubenswrapper[4812]: E0131 04:47:06.111358 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee\": container with ID starting with 54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee not found: ID does not exist" containerID="54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.111419 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"} err="failed to get container status \"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee\": rpc error: code = NotFound desc = could not find container \"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee\": container with ID starting with 54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee not found: ID does not exist"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.111463 4812 scope.go:117] "RemoveContainer" containerID="017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"
Jan 31 04:47:06 crc kubenswrapper[4812]: E0131 04:47:06.112033 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f\": container with ID starting with 017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f not found: ID does not exist" containerID="017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.112097 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"} err="failed to get container status \"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f\": rpc error: code = NotFound desc = could not find container \"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f\": container with ID starting with 017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f not found: ID does not exist"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.112138 4812 scope.go:117] "RemoveContainer" containerID="54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.113503 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee"} err="failed to get container status \"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee\": rpc error: code = NotFound desc = could not find container \"54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee\": container with ID starting with 54b8040b859bec36078880031809e3d8cce21e9907bd325297d4635f4fcc6dee not found: ID does not exist"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.113801 4812 scope.go:117] "RemoveContainer" containerID="017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.114417 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f"} err="failed to get container status \"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f\": rpc error: code = NotFound desc = could not find container \"017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f\": container with ID starting with 017d57d97d8ee0e918f7392f081bce3e6bcfa44fc8700be35b10cf94067bcf5f not found: ID does not exist"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.120430 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.133010 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:06 crc kubenswrapper[4812]: E0131 04:47:06.133549 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-log"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.133580 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-log"
Jan 31 04:47:06 crc kubenswrapper[4812]: E0131 04:47:06.133597 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-httpd"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.133613 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-httpd"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.133907 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-log"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.133927 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="025a220f-b254-4b50-8036-7aebb692d60c" containerName="glance-httpd"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.134919 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.137447 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.138114 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.138282 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"combined-ca-bundle"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.138410 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-internal-svc"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.139254 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"cert-glance-default-public-svc"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.139975 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-k4922"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.176389 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317321 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317399 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7wgms\" (UniqueName: \"kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317434 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317472 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317527 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317615 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317647 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317673 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.317721 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.347027 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="025a220f-b254-4b50-8036-7aebb692d60c" path="/var/lib/kubelet/pods/025a220f-b254-4b50-8036-7aebb692d60c/volumes"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419134 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419186 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419210 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419243 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419270 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419296 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7wgms\" (UniqueName: \"kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419319 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419345 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419359 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.419539 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.420028 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.420185 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.423050 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.423273 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.424615 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.426472 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.427302 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.436756 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7wgms\" (UniqueName: \"kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.461402 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-single-0\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:06 crc kubenswrapper[4812]: I0131 04:47:06.519430 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:07 crc kubenswrapper[4812]: I0131 04:47:06.999336 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:07 crc kubenswrapper[4812]: W0131 04:47:07.000111 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod05459308_85cb_472c_a0e5_00843e8070bd.slice/crio-3c40a33a46d1e97d298b0eba01f9528230e8f1ff7d27e2b92d402df9b6617e3c WatchSource:0}: Error finding container 3c40a33a46d1e97d298b0eba01f9528230e8f1ff7d27e2b92d402df9b6617e3c: Status 404 returned error can't find the container with id 3c40a33a46d1e97d298b0eba01f9528230e8f1ff7d27e2b92d402df9b6617e3c
Jan 31 04:47:07 crc kubenswrapper[4812]: I0131 04:47:07.052183 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerStarted","Data":"3c40a33a46d1e97d298b0eba01f9528230e8f1ff7d27e2b92d402df9b6617e3c"}
Jan 31 04:47:08 crc kubenswrapper[4812]: I0131 04:47:08.063500 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerStarted","Data":"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc"}
Jan 31 04:47:09 crc kubenswrapper[4812]: I0131 04:47:09.073543 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerStarted","Data":"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b"}
Jan 31 04:47:09 crc kubenswrapper[4812]: I0131 04:47:09.099748 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=3.099724194 podStartE2EDuration="3.099724194s" podCreationTimestamp="2026-01-31 04:47:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:09.093370194 +0000 UTC m=+1237.588391859" watchObservedRunningTime="2026-01-31 04:47:09.099724194 +0000 UTC m=+1237.594745869"
Jan 31 04:47:16 crc kubenswrapper[4812]: I0131 04:47:16.519931 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:16 crc kubenswrapper[4812]: I0131 04:47:16.520584 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:16 crc kubenswrapper[4812]: I0131 04:47:16.564511 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:16 crc kubenswrapper[4812]: I0131 04:47:16.566997 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:17 crc kubenswrapper[4812]: I0131 04:47:17.162454 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:17 crc kubenswrapper[4812]: I0131 04:47:17.162514 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.042439 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.043759 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0"
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.867579 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-nfwkt"]
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.881667 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-nfwkt"]
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.953352 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance9ae3-account-delete-tx6xt"]
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.954654 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.977085 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"]
Jan 31 04:47:19 crc kubenswrapper[4812]: I0131 04:47:19.985716 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance9ae3-account-delete-tx6xt"]
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.045354 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.045411 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4sm9v\" (UniqueName: \"kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.147033 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.147089 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4sm9v\" (UniqueName: \"kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.148100 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.167979 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4sm9v\" (UniqueName: \"kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v\") pod \"glance9ae3-account-delete-tx6xt\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.186499 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/glance-default-single-0" secret="" err="secret \"glance-glance-dockercfg-k4922\" not found"
Jan 31 04:47:20 crc kubenswrapper[4812]: E0131 04:47:20.198069 4812 desired_state_of_world_populator.go:312] "Error processing volume" err="error processing PVC glance-kuttl-tests/glance-glance-default-single-0: PVC is being deleted" pod="glance-kuttl-tests/glance-default-single-0" volumeName="glance"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.279411 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.349134 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d44fec3-78e6-40fd-9d28-c5843f96e777" path="/var/lib/kubelet/pods/1d44fec3-78e6-40fd-9d28-c5843f96e777/volumes"
Jan 31 04:47:20 crc kubenswrapper[4812]: I0131 04:47:20.719908 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance9ae3-account-delete-tx6xt"]
Jan 31 04:47:21 crc kubenswrapper[4812]: I0131 04:47:21.195563 4812 generic.go:334] "Generic (PLEG): container finished" podID="17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" containerID="0e5337d2ce3736a25cf57e4b0ff878deae238372e13026ef171c21b721a2b7da" exitCode=0
Jan 31 04:47:21 crc kubenswrapper[4812]: I0131 04:47:21.195617 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt" event={"ID":"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1","Type":"ContainerDied","Data":"0e5337d2ce3736a25cf57e4b0ff878deae238372e13026ef171c21b721a2b7da"}
Jan 31 04:47:21 crc kubenswrapper[4812]: I0131 04:47:21.195972 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt" event={"ID":"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1","Type":"ContainerStarted","Data":"5f06e5e3ea06f2653efc92a39f77fc2530e8874e9db7a29ff1270df604ef963d"}
Jan 31 04:47:21 crc kubenswrapper[4812]: I0131 04:47:21.196180 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-log" containerID="cri-o://4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc" gracePeriod=30
Jan 31 04:47:21 crc kubenswrapper[4812]: I0131 04:47:21.196208 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-httpd" containerID="cri-o://067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b" gracePeriod=30
Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.213336 4812 generic.go:334] "Generic (PLEG): container finished" podID="05459308-85cb-472c-a0e5-00843e8070bd" containerID="4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc" exitCode=143
Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.213485 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerDied","Data":"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc"}
Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.550018 4812 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt" Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.684287 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts\") pod \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.684748 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4sm9v\" (UniqueName: \"kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v\") pod \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\" (UID: \"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1\") " Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.685471 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" (UID: "17b638cc-e751-4b9f-b7d8-3eb4133cd8c1"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.690792 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v" (OuterVolumeSpecName: "kube-api-access-4sm9v") pod "17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" (UID: "17b638cc-e751-4b9f-b7d8-3eb4133cd8c1"). InnerVolumeSpecName "kube-api-access-4sm9v". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.786060 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4sm9v\" (UniqueName: \"kubernetes.io/projected/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-kube-api-access-4sm9v\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:22 crc kubenswrapper[4812]: I0131 04:47:22.786089 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:23 crc kubenswrapper[4812]: I0131 04:47:23.234671 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt" event={"ID":"17b638cc-e751-4b9f-b7d8-3eb4133cd8c1","Type":"ContainerDied","Data":"5f06e5e3ea06f2653efc92a39f77fc2530e8874e9db7a29ff1270df604ef963d"} Jan 31 04:47:23 crc kubenswrapper[4812]: I0131 04:47:23.234734 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="5f06e5e3ea06f2653efc92a39f77fc2530e8874e9db7a29ff1270df604ef963d" Jan 31 04:47:23 crc kubenswrapper[4812]: I0131 04:47:23.234820 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance9ae3-account-delete-tx6xt" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.774818 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.818708 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.818792 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.818867 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.818948 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.818973 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819011 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819067 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819112 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7wgms\" (UniqueName: \"kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819139 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle\") pod \"05459308-85cb-472c-a0e5-00843e8070bd\" (UID: \"05459308-85cb-472c-a0e5-00843e8070bd\") " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819347 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819505 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs" (OuterVolumeSpecName: "logs") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.819605 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.826226 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms" (OuterVolumeSpecName: "kube-api-access-7wgms") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "kube-api-access-7wgms". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.826226 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.835424 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts" (OuterVolumeSpecName: "scripts") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.846076 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle" (OuterVolumeSpecName: "combined-ca-bundle") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "combined-ca-bundle". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.880492 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs" (OuterVolumeSpecName: "public-tls-certs") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "public-tls-certs". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.883770 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data" (OuterVolumeSpecName: "config-data") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.887333 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs" (OuterVolumeSpecName: "internal-tls-certs") pod "05459308-85cb-472c-a0e5-00843e8070bd" (UID: "05459308-85cb-472c-a0e5-00843e8070bd"). InnerVolumeSpecName "internal-tls-certs". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921518 4812 reconciler_common.go:293] "Volume detached for volume \"internal-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-internal-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921594 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921608 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921620 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7wgms\" (UniqueName: \"kubernetes.io/projected/05459308-85cb-472c-a0e5-00843e8070bd-kube-api-access-7wgms\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921635 4812 reconciler_common.go:293] "Volume detached for volume \"combined-ca-bundle\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-combined-ca-bundle\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921646 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/05459308-85cb-472c-a0e5-00843e8070bd-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921668 4812 reconciler_common.go:293] "Volume detached for volume \"public-tls-certs\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-public-tls-certs\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.921679 4812 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/05459308-85cb-472c-a0e5-00843e8070bd-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.945491 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.965886 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-bzpzc"] Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.978118 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-bzpzc"] Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.985048 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance9ae3-account-delete-tx6xt"] Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.990851 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2"] Jan 31 04:47:24 crc kubenswrapper[4812]: I0131 04:47:24.995826 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-9ae3-account-create-update-f8wh2"] Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.000257 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance9ae3-account-delete-tx6xt"] Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.022973 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.272493 4812 generic.go:334] "Generic (PLEG): container finished" podID="05459308-85cb-472c-a0e5-00843e8070bd" containerID="067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b" exitCode=0 Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 
04:47:25.272554 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerDied","Data":"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b"} Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.272617 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"05459308-85cb-472c-a0e5-00843e8070bd","Type":"ContainerDied","Data":"3c40a33a46d1e97d298b0eba01f9528230e8f1ff7d27e2b92d402df9b6617e3c"} Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.272635 4812 scope.go:117] "RemoveContainer" containerID="067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.273209 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.306764 4812 scope.go:117] "RemoveContainer" containerID="4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.327061 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.330931 4812 scope.go:117] "RemoveContainer" containerID="067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b" Jan 31 04:47:25 crc kubenswrapper[4812]: E0131 04:47:25.331458 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b\": container with ID starting with 067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b not found: ID does not exist" containerID="067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 
04:47:25.331513 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b"} err="failed to get container status \"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b\": rpc error: code = NotFound desc = could not find container \"067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b\": container with ID starting with 067e3e41e1d2455e1f1971e8a7d3aac0acb84f3f9fefc3cc76e40c7adca1f65b not found: ID does not exist" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.331549 4812 scope.go:117] "RemoveContainer" containerID="4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc" Jan 31 04:47:25 crc kubenswrapper[4812]: E0131 04:47:25.331925 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc\": container with ID starting with 4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc not found: ID does not exist" containerID="4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.331998 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc"} err="failed to get container status \"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc\": rpc error: code = NotFound desc = could not find container \"4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc\": container with ID starting with 4b9c51d40e050940c25ca6e1aa42a82a58c4d9d837a4db1abab245df6b308bdc not found: ID does not exist" Jan 31 04:47:25 crc kubenswrapper[4812]: I0131 04:47:25.348089 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 
04:47:26.349169 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="05459308-85cb-472c-a0e5-00843e8070bd" path="/var/lib/kubelet/pods/05459308-85cb-472c-a0e5-00843e8070bd/volumes" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.350256 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" path="/var/lib/kubelet/pods/17b638cc-e751-4b9f-b7d8-3eb4133cd8c1/volumes" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.350752 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7c990aff-f3f1-480e-a40d-2426778a990b" path="/var/lib/kubelet/pods/7c990aff-f3f1-480e-a40d-2426778a990b/volumes" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.351897 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9e7e7548-7925-49f0-99ce-000ee043788a" path="/var/lib/kubelet/pods/9e7e7548-7925-49f0-99ce-000ee043788a/volumes" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352314 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-z5xnk"] Jan 31 04:47:26 crc kubenswrapper[4812]: E0131 04:47:26.352530 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-httpd" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352544 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-httpd" Jan 31 04:47:26 crc kubenswrapper[4812]: E0131 04:47:26.352557 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-log" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352563 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-log" Jan 31 04:47:26 crc kubenswrapper[4812]: E0131 04:47:26.352572 4812 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" containerName="mariadb-account-delete" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352578 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" containerName="mariadb-account-delete" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352700 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-log" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352710 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="17b638cc-e751-4b9f-b7d8-3eb4133cd8c1" containerName="mariadb-account-delete" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.352721 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="05459308-85cb-472c-a0e5-00843e8070bd" containerName="glance-httpd" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.353173 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.359755 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-3e69-account-create-update-8nv9x"] Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.360637 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.362477 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.370466 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-z5xnk"] Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.377770 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3e69-account-create-update-8nv9x"] Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.444395 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.444464 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.444498 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kz7p6\" (UniqueName: \"kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.444543 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-27v8g\" (UniqueName: \"kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.545890 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.546299 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.546503 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-kz7p6\" (UniqueName: \"kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.546715 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-27v8g\" (UniqueName: \"kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " 
pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.547084 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.547305 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.568820 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-27v8g\" (UniqueName: \"kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g\") pod \"glance-3e69-account-create-update-8nv9x\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.575292 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-kz7p6\" (UniqueName: \"kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6\") pod \"glance-db-create-z5xnk\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.670562 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:26 crc kubenswrapper[4812]: I0131 04:47:26.680047 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:27 crc kubenswrapper[4812]: I0131 04:47:27.115262 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-z5xnk"] Jan 31 04:47:27 crc kubenswrapper[4812]: I0131 04:47:27.171514 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-3e69-account-create-update-8nv9x"] Jan 31 04:47:27 crc kubenswrapper[4812]: W0131 04:47:27.182138 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod4d7a410c_5f45_43ba_8f11_f3d5ab50b2c4.slice/crio-b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041 WatchSource:0}: Error finding container b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041: Status 404 returned error can't find the container with id b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041 Jan 31 04:47:27 crc kubenswrapper[4812]: I0131 04:47:27.287755 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-z5xnk" event={"ID":"d7fd73e1-c08b-4d86-992d-42da4bee71ec","Type":"ContainerStarted","Data":"49b1e93bf2acdcdd706d6f71285f0a12403d9b27c51b49a65e0e4c82ed7c7f74"} Jan 31 04:47:27 crc kubenswrapper[4812]: I0131 04:47:27.289314 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" event={"ID":"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4","Type":"ContainerStarted","Data":"b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041"} Jan 31 04:47:28 crc kubenswrapper[4812]: I0131 04:47:28.301458 4812 generic.go:334] "Generic (PLEG): container finished" podID="4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" containerID="e51ab9b26999f0955f93725960efb8199b1fee513f942f95303d4655c737afac" exitCode=0 Jan 31 04:47:28 crc kubenswrapper[4812]: I0131 04:47:28.301527 4812 kubelet.go:2453] "SyncLoop (PLEG): event 
for pod" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" event={"ID":"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4","Type":"ContainerDied","Data":"e51ab9b26999f0955f93725960efb8199b1fee513f942f95303d4655c737afac"} Jan 31 04:47:28 crc kubenswrapper[4812]: I0131 04:47:28.303882 4812 generic.go:334] "Generic (PLEG): container finished" podID="d7fd73e1-c08b-4d86-992d-42da4bee71ec" containerID="c4a6184a03c27e1e3532aad1acb22e775b25f84064b83caa079ffd87496ab4a5" exitCode=0 Jan 31 04:47:28 crc kubenswrapper[4812]: I0131 04:47:28.303963 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-z5xnk" event={"ID":"d7fd73e1-c08b-4d86-992d-42da4bee71ec","Type":"ContainerDied","Data":"c4a6184a03c27e1e3532aad1acb22e775b25f84064b83caa079ffd87496ab4a5"} Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.682741 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.688932 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.799606 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-27v8g\" (UniqueName: \"kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g\") pod \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.799666 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kz7p6\" (UniqueName: \"kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6\") pod \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.799736 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts\") pod \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\" (UID: \"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4\") " Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.799822 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts\") pod \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\" (UID: \"d7fd73e1-c08b-4d86-992d-42da4bee71ec\") " Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.800667 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "d7fd73e1-c08b-4d86-992d-42da4bee71ec" (UID: "d7fd73e1-c08b-4d86-992d-42da4bee71ec"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.802552 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" (UID: "4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.806046 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6" (OuterVolumeSpecName: "kube-api-access-kz7p6") pod "d7fd73e1-c08b-4d86-992d-42da4bee71ec" (UID: "d7fd73e1-c08b-4d86-992d-42da4bee71ec"). InnerVolumeSpecName "kube-api-access-kz7p6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.808066 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g" (OuterVolumeSpecName: "kube-api-access-27v8g") pod "4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" (UID: "4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4"). InnerVolumeSpecName "kube-api-access-27v8g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.901343 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/d7fd73e1-c08b-4d86-992d-42da4bee71ec-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.901386 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-27v8g\" (UniqueName: \"kubernetes.io/projected/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-kube-api-access-27v8g\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.901405 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-kz7p6\" (UniqueName: \"kubernetes.io/projected/d7fd73e1-c08b-4d86-992d-42da4bee71ec-kube-api-access-kz7p6\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:29 crc kubenswrapper[4812]: I0131 04:47:29.901422 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.326736 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-z5xnk" Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.326744 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-z5xnk" event={"ID":"d7fd73e1-c08b-4d86-992d-42da4bee71ec","Type":"ContainerDied","Data":"49b1e93bf2acdcdd706d6f71285f0a12403d9b27c51b49a65e0e4c82ed7c7f74"} Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.326870 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="49b1e93bf2acdcdd706d6f71285f0a12403d9b27c51b49a65e0e4c82ed7c7f74" Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.328643 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" event={"ID":"4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4","Type":"ContainerDied","Data":"b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041"} Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.328686 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b66957bfd3914985ec3c409fbddc2b491df25c93c6e627626a3beae6e6168041" Jan 31 04:47:30 crc kubenswrapper[4812]: I0131 04:47:30.328746 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-3e69-account-create-update-8nv9x" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.572082 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-rns6p"] Jan 31 04:47:31 crc kubenswrapper[4812]: E0131 04:47:31.572604 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" containerName="mariadb-account-create-update" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.572616 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" containerName="mariadb-account-create-update" Jan 31 04:47:31 crc kubenswrapper[4812]: E0131 04:47:31.572624 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d7fd73e1-c08b-4d86-992d-42da4bee71ec" containerName="mariadb-database-create" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.572630 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d7fd73e1-c08b-4d86-992d-42da4bee71ec" containerName="mariadb-database-create" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.572754 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" containerName="mariadb-account-create-update" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.572769 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d7fd73e1-c08b-4d86-992d-42da4bee71ec" containerName="mariadb-database-create" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.573253 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.575245 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-97c6k" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.587897 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rns6p"] Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.588147 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.637082 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5zj9p\" (UniqueName: \"kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.637223 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.637268 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.738857 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-5zj9p\" (UniqueName: 
\"kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.738957 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.738985 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.743885 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.753733 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data\") pod \"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.757174 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5zj9p\" (UniqueName: \"kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p\") pod 
\"glance-db-sync-rns6p\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:31 crc kubenswrapper[4812]: I0131 04:47:31.901275 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:32 crc kubenswrapper[4812]: I0131 04:47:32.373217 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rns6p"] Jan 31 04:47:33 crc kubenswrapper[4812]: I0131 04:47:33.357527 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rns6p" event={"ID":"1f05ab78-8195-45b9-8370-fd54e6ef1e75","Type":"ContainerStarted","Data":"64ef56966501bb3212494535586a8d5491def695b937c425a1a878459582fdc7"} Jan 31 04:47:33 crc kubenswrapper[4812]: I0131 04:47:33.357815 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rns6p" event={"ID":"1f05ab78-8195-45b9-8370-fd54e6ef1e75","Type":"ContainerStarted","Data":"264b3d3218f26e09ee15d18cb47ddf66206583d6ffbc0f6f1ac15cdee0133096"} Jan 31 04:47:36 crc kubenswrapper[4812]: I0131 04:47:36.391104 4812 generic.go:334] "Generic (PLEG): container finished" podID="1f05ab78-8195-45b9-8370-fd54e6ef1e75" containerID="64ef56966501bb3212494535586a8d5491def695b937c425a1a878459582fdc7" exitCode=0 Jan 31 04:47:36 crc kubenswrapper[4812]: I0131 04:47:36.391194 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rns6p" event={"ID":"1f05ab78-8195-45b9-8370-fd54e6ef1e75","Type":"ContainerDied","Data":"64ef56966501bb3212494535586a8d5491def695b937c425a1a878459582fdc7"} Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.736565 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.830628 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5zj9p\" (UniqueName: \"kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p\") pod \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.830711 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data\") pod \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.830803 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data\") pod \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\" (UID: \"1f05ab78-8195-45b9-8370-fd54e6ef1e75\") " Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.837212 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p" (OuterVolumeSpecName: "kube-api-access-5zj9p") pod "1f05ab78-8195-45b9-8370-fd54e6ef1e75" (UID: "1f05ab78-8195-45b9-8370-fd54e6ef1e75"). InnerVolumeSpecName "kube-api-access-5zj9p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.838212 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "1f05ab78-8195-45b9-8370-fd54e6ef1e75" (UID: "1f05ab78-8195-45b9-8370-fd54e6ef1e75"). 
InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.873441 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data" (OuterVolumeSpecName: "config-data") pod "1f05ab78-8195-45b9-8370-fd54e6ef1e75" (UID: "1f05ab78-8195-45b9-8370-fd54e6ef1e75"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.933520 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.933564 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5zj9p\" (UniqueName: \"kubernetes.io/projected/1f05ab78-8195-45b9-8370-fd54e6ef1e75-kube-api-access-5zj9p\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:37 crc kubenswrapper[4812]: I0131 04:47:37.933585 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/1f05ab78-8195-45b9-8370-fd54e6ef1e75-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:38 crc kubenswrapper[4812]: I0131 04:47:38.409875 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-rns6p" event={"ID":"1f05ab78-8195-45b9-8370-fd54e6ef1e75","Type":"ContainerDied","Data":"264b3d3218f26e09ee15d18cb47ddf66206583d6ffbc0f6f1ac15cdee0133096"} Jan 31 04:47:38 crc kubenswrapper[4812]: I0131 04:47:38.410330 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="264b3d3218f26e09ee15d18cb47ddf66206583d6ffbc0f6f1ac15cdee0133096" Jan 31 04:47:38 crc kubenswrapper[4812]: I0131 04:47:38.409943 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-rns6p" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.377481 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:47:39 crc kubenswrapper[4812]: E0131 04:47:39.377755 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1f05ab78-8195-45b9-8370-fd54e6ef1e75" containerName="glance-db-sync" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.377768 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f05ab78-8195-45b9-8370-fd54e6ef1e75" containerName="glance-db-sync" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.378098 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1f05ab78-8195-45b9-8370-fd54e6ef1e75" containerName="glance-db-sync" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.378971 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.384233 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-97c6k" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.384422 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.384596 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.414794 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.456497 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.456740 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.456829 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.456970 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457053 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457126 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457182 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457241 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457336 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fvgtq\" (UniqueName: \"kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457418 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457495 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457578 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457680 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.457765 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.558824 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559077 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559166 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559266 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559355 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559511 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559596 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" 
(UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559679 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559794 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-fvgtq\" (UniqueName: \"kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559896 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.559983 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.560071 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.560168 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.560244 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.560811 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561688 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561745 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick\") pod 
\"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561775 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561808 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561870 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561894 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.561917 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.562096 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.562189 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.562352 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.566034 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.572503 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.585957 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.586545 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-fvgtq\" (UniqueName: \"kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.586801 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-external-api-0\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:39 crc kubenswrapper[4812]: I0131 04:47:39.696772 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.032453 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.034356 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.037812 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.048820 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067554 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067599 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067623 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067643 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run\") pod \"glance-default-internal-api-0\" (UID: 
\"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067690 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067832 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067961 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.067984 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068053 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules\") pod 
\"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068081 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068107 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068144 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068163 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.068215 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swckn\" (UniqueName: 
\"kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.155025 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170245 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170315 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170365 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170401 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170402 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170492 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170724 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170756 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170818 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170854 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170877 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170919 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170938 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.170992 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-swckn\" (UniqueName: \"kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171068 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171111 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171184 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171230 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171305 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171422 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171567 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.171854 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.172352 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.172696 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.172778 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.181911 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.183238 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.193080 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-swckn\" (UniqueName: \"kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.195393 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.206662 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.348524 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.445709 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.453348 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerStarted","Data":"88e1bcad1eaf5302747b4e15a51e0f87ec3f612549475615a8b8074643e26cd5"} Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.453384 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerStarted","Data":"24bb03fb5bf6f9e34d428536f734d4efb64cb6e12d732c97d30a3c58f7ef3764"} Jan 31 04:47:40 crc kubenswrapper[4812]: I0131 04:47:40.802590 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:40 crc kubenswrapper[4812]: W0131 04:47:40.803019 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podb7a9418c_7312_4d06_a5cf_c37c06cdc688.slice/crio-00fa2404d138b7d5f6c3a87ac5431f38ffea96709ab7a68810aeb2fcedbf3762 WatchSource:0}: Error finding container 00fa2404d138b7d5f6c3a87ac5431f38ffea96709ab7a68810aeb2fcedbf3762: Status 404 returned error can't find the container with id 00fa2404d138b7d5f6c3a87ac5431f38ffea96709ab7a68810aeb2fcedbf3762 Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.462500 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerStarted","Data":"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.464100 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerStarted","Data":"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.464239 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerStarted","Data":"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.464353 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerStarted","Data":"00fa2404d138b7d5f6c3a87ac5431f38ffea96709ab7a68810aeb2fcedbf3762"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.462659 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-log" containerID="cri-o://cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" gracePeriod=30 Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.462748 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-api" containerID="cri-o://5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" gracePeriod=30 Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.462784 4812 
kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-httpd" containerID="cri-o://72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" gracePeriod=30 Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.468376 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerStarted","Data":"a3e32a28184e365c87ed6478bc1f2546b5682e97e871939b919694391d1da710"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.468808 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerStarted","Data":"5e30560af9499ca05fdc3aa7d51047b5836a053b55102c0a2d3274593dba2818"} Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.505645 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.505624209 podStartE2EDuration="3.505624209s" podCreationTimestamp="2026-01-31 04:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:41.495143648 +0000 UTC m=+1269.990165373" watchObservedRunningTime="2026-01-31 04:47:41.505624209 +0000 UTC m=+1270.000645905" Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.526264 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.526244973 podStartE2EDuration="2.526244973s" podCreationTimestamp="2026-01-31 04:47:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:41.519131061 
+0000 UTC m=+1270.014152746" watchObservedRunningTime="2026-01-31 04:47:41.526244973 +0000 UTC m=+1270.021266638" Jan 31 04:47:41 crc kubenswrapper[4812]: I0131 04:47:41.874452 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007453 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007496 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007518 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007561 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007583 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " 
Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007583 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run" (OuterVolumeSpecName: "run") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007622 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007645 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007649 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007699 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007703 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007787 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007809 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007847 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007875 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-swckn\" (UniqueName: \"kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007894 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007936 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run\") pod \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\" (UID: \"b7a9418c-7312-4d06-a5cf-c37c06cdc688\") " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008236 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008643 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008656 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007719 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys" (OuterVolumeSpecName: "sys") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007734 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008701 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.007755 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev" (OuterVolumeSpecName: "dev") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008298 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.008591 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs" (OuterVolumeSpecName: "logs") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.014973 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.017144 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn" (OuterVolumeSpecName: "kube-api-access-swckn") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "kube-api-access-swckn". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.017856 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts" (OuterVolumeSpecName: "scripts") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.018462 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.094577 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data" (OuterVolumeSpecName: "config-data") pod "b7a9418c-7312-4d06-a5cf-c37c06cdc688" (UID: "b7a9418c-7312-4d06-a5cf-c37c06cdc688"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110462 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110494 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110504 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110515 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b7a9418c-7312-4d06-a5cf-c37c06cdc688-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110522 4812 reconciler_common.go:293] "Volume 
detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110532 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-swckn\" (UniqueName: \"kubernetes.io/projected/b7a9418c-7312-4d06-a5cf-c37c06cdc688-kube-api-access-swckn\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110568 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110578 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b7a9418c-7312-4d06-a5cf-c37c06cdc688-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110587 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110606 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.110614 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b7a9418c-7312-4d06-a5cf-c37c06cdc688-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.122407 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 
04:47:42.123824 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.212637 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.212688 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.480784 4812 generic.go:334] "Generic (PLEG): container finished" podID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" exitCode=143 Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.480821 4812 generic.go:334] "Generic (PLEG): container finished" podID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" exitCode=143 Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.480914 4812 generic.go:334] "Generic (PLEG): container finished" podID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" exitCode=143 Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.480877 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerDied","Data":"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e"} Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.481007 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerDied","Data":"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678"} Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.481027 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerDied","Data":"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f"} Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.481039 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"b7a9418c-7312-4d06-a5cf-c37c06cdc688","Type":"ContainerDied","Data":"00fa2404d138b7d5f6c3a87ac5431f38ffea96709ab7a68810aeb2fcedbf3762"} Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.481058 4812 scope.go:117] "RemoveContainer" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.483537 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.518636 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.525490 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.550430 4812 scope.go:117] "RemoveContainer" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.575689 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 04:47:42.576035 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-log" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.576055 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-log" Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 04:47:42.576084 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-api" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.576094 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-api" Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 04:47:42.576109 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-httpd" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.576118 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-httpd" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 
04:47:42.576274 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-httpd" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.576290 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-log" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.576305 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" containerName="glance-api" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.577568 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.578348 4812 scope.go:117] "RemoveContainer" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.582437 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.606359 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.634223 4812 scope.go:117] "RemoveContainer" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 04:47:42.634593 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": container with ID starting with 5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e not found: ID does not exist" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.634619 4812 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e"} err="failed to get container status \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": rpc error: code = NotFound desc = could not find container \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": container with ID starting with 5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.634639 4812 scope.go:117] "RemoveContainer" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 04:47:42.635034 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": container with ID starting with 72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678 not found: ID does not exist" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.635055 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678"} err="failed to get container status \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": rpc error: code = NotFound desc = could not find container \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": container with ID starting with 72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678 not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.635070 4812 scope.go:117] "RemoveContainer" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" Jan 31 04:47:42 crc kubenswrapper[4812]: E0131 
04:47:42.635401 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": container with ID starting with cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f not found: ID does not exist" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.635453 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f"} err="failed to get container status \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": rpc error: code = NotFound desc = could not find container \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": container with ID starting with cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.635489 4812 scope.go:117] "RemoveContainer" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.636126 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e"} err="failed to get container status \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": rpc error: code = NotFound desc = could not find container \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": container with ID starting with 5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.636159 4812 scope.go:117] "RemoveContainer" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" Jan 31 04:47:42 crc 
kubenswrapper[4812]: I0131 04:47:42.636801 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678"} err="failed to get container status \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": rpc error: code = NotFound desc = could not find container \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": container with ID starting with 72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678 not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.636833 4812 scope.go:117] "RemoveContainer" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.637199 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f"} err="failed to get container status \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": rpc error: code = NotFound desc = could not find container \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": container with ID starting with cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.637234 4812 scope.go:117] "RemoveContainer" containerID="5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.637758 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e"} err="failed to get container status \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": rpc error: code = NotFound desc = could not find container \"5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e\": container 
with ID starting with 5170df3d09ccc3cd2208bc54b437c1692e40f171a25d84145a324c305c60169e not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.637790 4812 scope.go:117] "RemoveContainer" containerID="72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.638655 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678"} err="failed to get container status \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": rpc error: code = NotFound desc = could not find container \"72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678\": container with ID starting with 72ddca8f7a76225b06ebdff82fc8622ca548be511791ee5fdb4c92843d695678 not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.638686 4812 scope.go:117] "RemoveContainer" containerID="cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.639025 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f"} err="failed to get container status \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": rpc error: code = NotFound desc = could not find container \"cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f\": container with ID starting with cb2777948d11a3ef7cc631c2e23fb2b308f567b153613f967af54ab66d3a2a7f not found: ID does not exist" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.726747 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9g4w6\" (UniqueName: \"kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6\") pod \"glance-default-internal-api-0\" (UID: 
\"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.726823 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.726977 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.727001 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.727030 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.727050 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.727070 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.728829 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.728914 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.728947 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.728973 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.729026 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.729054 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.729125 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830436 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830481 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830505 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830521 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830538 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830563 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830588 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod 
\"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830610 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830628 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830656 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830672 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830689 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") 
" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830720 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9g4w6\" (UniqueName: \"kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830744 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830803 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830865 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.830930 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") device mount path \"/mnt/openstack/pv01\"" 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831340 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831529 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831631 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831670 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831722 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 
04:47:42.831995 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.832018 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.831901 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.836183 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.847761 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.851537 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.856784 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.859468 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9g4w6\" (UniqueName: \"kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6\") pod \"glance-default-internal-api-0\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:42 crc kubenswrapper[4812]: I0131 04:47:42.932470 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:43 crc kubenswrapper[4812]: I0131 04:47:43.384166 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:47:43 crc kubenswrapper[4812]: W0131 04:47:43.386244 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc849d85a_5e07_42e8_98f3_506b9e711ae7.slice/crio-d62f0c0f9230bf98391a5dd08d9766d994a47be992898fc450a34f9b4495d1e0 WatchSource:0}: Error finding container d62f0c0f9230bf98391a5dd08d9766d994a47be992898fc450a34f9b4495d1e0: Status 404 returned error can't find the container with id d62f0c0f9230bf98391a5dd08d9766d994a47be992898fc450a34f9b4495d1e0 Jan 31 04:47:43 crc kubenswrapper[4812]: I0131 04:47:43.488753 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerStarted","Data":"d62f0c0f9230bf98391a5dd08d9766d994a47be992898fc450a34f9b4495d1e0"} Jan 31 04:47:44 crc kubenswrapper[4812]: I0131 04:47:44.351155 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b7a9418c-7312-4d06-a5cf-c37c06cdc688" path="/var/lib/kubelet/pods/b7a9418c-7312-4d06-a5cf-c37c06cdc688/volumes" Jan 31 04:47:44 crc kubenswrapper[4812]: I0131 04:47:44.508129 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerStarted","Data":"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78"} Jan 31 04:47:44 crc kubenswrapper[4812]: I0131 04:47:44.508191 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerStarted","Data":"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6"} Jan 31 04:47:44 crc kubenswrapper[4812]: I0131 04:47:44.508213 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerStarted","Data":"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526"} Jan 31 04:47:44 crc kubenswrapper[4812]: I0131 04:47:44.551014 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.550990314 podStartE2EDuration="2.550990314s" podCreationTimestamp="2026-01-31 04:47:42 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:44.543351829 +0000 UTC m=+1273.038373504" watchObservedRunningTime="2026-01-31 04:47:44.550990314 +0000 UTC m=+1273.046012009" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.697937 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.698587 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.698617 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.740344 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.742294 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:49 crc kubenswrapper[4812]: I0131 04:47:49.775911 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.573336 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.573615 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.573679 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.589656 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.590455 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:50 crc kubenswrapper[4812]: I0131 04:47:50.595657 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:47:52 crc kubenswrapper[4812]: I0131 04:47:52.933287 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:52 crc kubenswrapper[4812]: I0131 04:47:52.933696 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:52 crc kubenswrapper[4812]: I0131 04:47:52.933726 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:52 crc 
kubenswrapper[4812]: I0131 04:47:52.971385 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:52 crc kubenswrapper[4812]: I0131 04:47:52.971622 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:52 crc kubenswrapper[4812]: I0131 04:47:52.989995 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.599485 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.599934 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.599955 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.617712 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.623383 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:53 crc kubenswrapper[4812]: I0131 04:47:53.626816 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.496098 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.498344 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.504066 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.505722 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.537289 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.567440 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.597232 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.598798 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.610018 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.612607 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.648026 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.654829 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.658570 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.658630 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.658664 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.658684 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.658704 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662054 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662102 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662127 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662168 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: 
\"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662197 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662215 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662232 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662501 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662561 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick\") pod 
\"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662580 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662601 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662648 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662664 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662693 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662738 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662764 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrhv\" (UniqueName: \"kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662801 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662828 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662856 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662899 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lw7k7\" (UniqueName: \"kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662928 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662943 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.662969 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764513 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764571 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764598 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764624 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764647 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764672 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764694 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764714 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764724 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764789 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764860 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764896 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764920 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764966 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.764986 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765026 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765060 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765089 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765093 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765118 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765121 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765428 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765490 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765544 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765563 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765591 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765611 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765629 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765655 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765682 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765698 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"sys\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765715 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765742 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765761 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765780 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrhv\" (UniqueName: \"kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765795 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765818 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8sv9k\" (UniqueName: \"kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765852 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765887 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765907 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765928 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765951 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.765971 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766007 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766033 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lw7k7\" (UniqueName: \"kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766055 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766083 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766106 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766125 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766150 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766167 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766192 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766209 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766249 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766270 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766292 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766326 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766343 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766363 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766427 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766503 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"dev\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766550 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766615 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bg2gc\" (UniqueName: \"kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.766655 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.767483 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.767581 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.767649 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.767950 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.768012 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.768261 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.768326 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.769040 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.769335 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.770129 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.770261 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.771284 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.771594 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.771924 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.774875 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.774970 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.778098 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data\") pod 
\"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.781714 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.789526 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lw7k7\" (UniqueName: \"kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.790751 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrhv\" (UniqueName: \"kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.796150 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-external-api-1\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.797269 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-1\" (UID: 
\"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.801876 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.815233 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-2\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.833341 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.857815 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.867829 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.867922 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868019 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868057 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868134 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bg2gc\" (UniqueName: \"kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " 
pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868181 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868212 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868243 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868278 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868292 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") device mount path \"/mnt/openstack/pv04\"" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868315 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.868272 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.871734 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.871879 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872214 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872284 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872367 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872436 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872490 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.872559 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc 
kubenswrapper[4812]: I0131 04:47:55.873997 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874061 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874088 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874162 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874223 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874245 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874343 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874172 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874372 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874404 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874465 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874491 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8sv9k\" (UniqueName: \"kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874544 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874579 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874631 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874701 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874742 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874804 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874825 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875044 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874410 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys\") pod 
\"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875130 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874744 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875442 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875502 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.874301 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") device mount path 
\"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875594 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.875795 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.876498 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.876869 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.883736 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc 
kubenswrapper[4812]: I0131 04:47:55.889992 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bg2gc\" (UniqueName: \"kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.897403 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.901294 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.904700 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.905090 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8sv9k\" (UniqueName: \"kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.926412 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.939084 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.943253 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-2\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.945267 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-internal-api-1\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:55 crc kubenswrapper[4812]: I0131 04:47:55.948098 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.206900 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.237213 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:47:56 crc kubenswrapper[4812]: W0131 04:47:56.345285 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod532e2e2a_49ec_4141_ae9f_c61830fc352c.slice/crio-f5494712e09fd6ed2e81dccf980dd6196dd1ac963e60f4faefca99a8d8186511 WatchSource:0}: Error finding container f5494712e09fd6ed2e81dccf980dd6196dd1ac963e60f4faefca99a8d8186511: Status 404 returned error can't find the container with id f5494712e09fd6ed2e81dccf980dd6196dd1ac963e60f4faefca99a8d8186511 Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.357717 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.490418 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.623812 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerStarted","Data":"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.623937 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerStarted","Data":"969ea1be606d085762dce9c1ce34bc3a207eb834c874ba9b0322bbb4f6b8a0ee"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.627068 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerStarted","Data":"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 
04:47:56.627111 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerStarted","Data":"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.627125 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerStarted","Data":"1c37d04bf035dfbe5151af0b81eaef75ae48c84386b8a4f5cc41fb9f9beaf139"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.635548 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerStarted","Data":"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.635599 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerStarted","Data":"f5494712e09fd6ed2e81dccf980dd6196dd1ac963e60f4faefca99a8d8186511"} Jan 31 04:47:56 crc kubenswrapper[4812]: I0131 04:47:56.656507 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.648528 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerStarted","Data":"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.649106 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" 
event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerStarted","Data":"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.653595 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerStarted","Data":"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.653652 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerStarted","Data":"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.653671 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerStarted","Data":"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.653690 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerStarted","Data":"cd9510e33bce5882c597239c14f813cbd16947b06dfcc3863a392de01ad9eecd"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.658131 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerStarted","Data":"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.658166 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" 
event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerStarted","Data":"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.662120 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerStarted","Data":"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0"} Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.687770 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.6877437410000002 podStartE2EDuration="3.687743741s" podCreationTimestamp="2026-01-31 04:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:57.681042281 +0000 UTC m=+1286.176063966" watchObservedRunningTime="2026-01-31 04:47:57.687743741 +0000 UTC m=+1286.182765446" Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.724534 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.724513389 podStartE2EDuration="3.724513389s" podCreationTimestamp="2026-01-31 04:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:57.721588939 +0000 UTC m=+1286.216610604" watchObservedRunningTime="2026-01-31 04:47:57.724513389 +0000 UTC m=+1286.219535054" Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.772095 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.772073914 podStartE2EDuration="3.772073914s" podCreationTimestamp="2026-01-31 04:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 
+0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:57.761296106 +0000 UTC m=+1286.256317781" watchObservedRunningTime="2026-01-31 04:47:57.772073914 +0000 UTC m=+1286.267095579" Jan 31 04:47:57 crc kubenswrapper[4812]: I0131 04:47:57.808604 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.808583835 podStartE2EDuration="3.808583835s" podCreationTimestamp="2026-01-31 04:47:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:47:57.79278279 +0000 UTC m=+1286.287804475" watchObservedRunningTime="2026-01-31 04:47:57.808583835 +0000 UTC m=+1286.303605500" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.834055 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.835079 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.836924 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.858708 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.858783 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.858810 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc 
kubenswrapper[4812]: I0131 04:48:05.862232 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.882803 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.908117 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.909787 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.915004 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.921807 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.948338 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.948398 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.948416 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.978416 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:05 crc kubenswrapper[4812]: I0131 04:48:05.983498 4812 kubelet.go:2542] 
"SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.011770 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.238479 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.238543 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.238557 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.266440 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.269927 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.287985 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.754287 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.754986 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755043 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755071 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755089 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755116 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755133 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755148 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755165 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755182 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755197 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.755211 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.767447 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.767986 4812 
kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.769571 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.771182 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.771653 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.772335 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.772629 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.773703 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.774405 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.776635 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.778445 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:06 crc kubenswrapper[4812]: I0131 04:48:06.778677 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:08 crc kubenswrapper[4812]: I0131 04:48:08.216756 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:08 crc kubenswrapper[4812]: I0131 04:48:08.228479 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:08 crc kubenswrapper[4812]: I0131 04:48:08.383789 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:08 crc kubenswrapper[4812]: I0131 04:48:08.392633 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.782793 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-log" containerID="cri-o://2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.782955 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-httpd" containerID="cri-o://d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.782970 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-api" containerID="cri-o://130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783330 4812 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-log" containerID="cri-o://932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783368 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-api" containerID="cri-o://aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783369 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-httpd" containerID="cri-o://34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783486 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-log" containerID="cri-o://34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783535 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-api" containerID="cri-o://f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783562 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-httpd" 
containerID="cri-o://e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783641 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-log" containerID="cri-o://7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783676 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-api" containerID="cri-o://bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" gracePeriod=30 Jan 31 04:48:09 crc kubenswrapper[4812]: I0131 04:48:09.783681 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-httpd" containerID="cri-o://d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" gracePeriod=30 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.496928 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.548146 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559601 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559670 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559747 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559828 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run" (OuterVolumeSpecName: "run") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559904 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.559974 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560028 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560052 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560076 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560124 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick\") 
pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560161 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560164 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560186 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560245 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560299 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lw7k7\" (UniqueName: \"kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560322 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.560393 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data\") pod \"532e2e2a-49ec-4141-ae9f-c61830fc352c\" (UID: \"532e2e2a-49ec-4141-ae9f-c61830fc352c\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.562085 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.562144 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.562157 
4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.566947 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.566981 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.567061 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.567239 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs" (OuterVolumeSpecName: "logs") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.567276 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev" (OuterVolumeSpecName: "dev") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.567296 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys" (OuterVolumeSpecName: "sys") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.568294 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.569450 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "local-storage20-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.571140 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts" (OuterVolumeSpecName: "scripts") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.574241 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7" (OuterVolumeSpecName: "kube-api-access-lw7k7") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "kube-api-access-lw7k7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.642495 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.663246 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.663365 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.663396 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.663781 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.663893 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.664037 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665632 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665697 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665718 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665748 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665902 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" 
(UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665934 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665959 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.665978 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666079 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666120 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666146 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666186 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666206 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666229 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8sv9k\" (UniqueName: \"kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666304 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666344 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 
04:48:10.666364 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666401 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666490 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666514 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666792 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666863 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run\") pod \"cba4939b-2319-41a5-8471-7b405315de18\" (UID: \"cba4939b-2319-41a5-8471-7b405315de18\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666887 
4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.666947 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bg2gc\" (UniqueName: \"kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667055 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs\") pod \"5c2c0959-3069-4fde-b59d-1c265497ccda\" (UID: \"5c2c0959-3069-4fde-b59d-1c265497ccda\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667690 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667709 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667721 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667807 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lw7k7\" (UniqueName: \"kubernetes.io/projected/532e2e2a-49ec-4141-ae9f-c61830fc352c-kube-api-access-lw7k7\") on node 
\"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667883 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667897 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667907 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667919 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667944 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667960 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667972 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667983 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/532e2e2a-49ec-4141-ae9f-c61830fc352c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.667994 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/532e2e2a-49ec-4141-ae9f-c61830fc352c-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.669178 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys" (OuterVolumeSpecName: "sys") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673347 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys" (OuterVolumeSpecName: "sys") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673414 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673665 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance-cache") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673719 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673745 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673770 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev" (OuterVolumeSpecName: "dev") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.673789 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674204 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev" (OuterVolumeSpecName: "dev") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674265 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674476 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674515 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674695 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run" (OuterVolumeSpecName: "run") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.674729 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run" (OuterVolumeSpecName: "run") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.676515 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs" (OuterVolumeSpecName: "logs") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.677656 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs" (OuterVolumeSpecName: "logs") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.677728 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts" (OuterVolumeSpecName: "scripts") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.678073 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k" (OuterVolumeSpecName: "kube-api-access-8sv9k") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "kube-api-access-8sv9k". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.679008 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.680906 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc" (OuterVolumeSpecName: "kube-api-access-bg2gc") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "kube-api-access-bg2gc". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.682039 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.682258 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts" (OuterVolumeSpecName: "scripts") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.700484 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.701002 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.702695 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data" (OuterVolumeSpecName: "config-data") pod "532e2e2a-49ec-4141-ae9f-c61830fc352c" (UID: "532e2e2a-49ec-4141-ae9f-c61830fc352c"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.704598 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.718627 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.768691 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.768775 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.768798 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.768969 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.768991 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" 
(UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769019 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4jrhv\" (UniqueName: \"kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769049 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769090 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769105 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769119 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769139 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume 
started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769158 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769180 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769195 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run\") pod \"8db441b6-c1f4-442d-b274-e4a80d9340fa\" (UID: \"8db441b6-c1f4-442d-b274-e4a80d9340fa\") " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769476 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769494 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769512 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on 
node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769521 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769533 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769542 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769550 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769560 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769568 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bg2gc\" (UniqueName: \"kubernetes.io/projected/5c2c0959-3069-4fde-b59d-1c265497ccda-kube-api-access-bg2gc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769576 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769588 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769596 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769605 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769616 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/cba4939b-2319-41a5-8471-7b405315de18-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769641 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/532e2e2a-49ec-4141-ae9f-c61830fc352c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769652 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769660 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769668 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769676 4812 reconciler_common.go:293] "Volume detached for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/cba4939b-2319-41a5-8471-7b405315de18-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769684 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/5c2c0959-3069-4fde-b59d-1c265497ccda-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769692 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769700 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769710 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769721 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5c2c0959-3069-4fde-b59d-1c265497ccda-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769734 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8sv9k\" (UniqueName: \"kubernetes.io/projected/cba4939b-2319-41a5-8471-7b405315de18-kube-api-access-8sv9k\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769756 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769909 4812 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev" (OuterVolumeSpecName: "dev") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.769959 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run" (OuterVolumeSpecName: "run") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.771919 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772004 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772034 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772286 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs" (OuterVolumeSpecName: "logs") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772321 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772362 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys" (OuterVolumeSpecName: "sys") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.772782 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.773751 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts" (OuterVolumeSpecName: "scripts") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.774447 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.775297 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.776131 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv" (OuterVolumeSpecName: "kube-api-access-4jrhv") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "kube-api-access-4jrhv". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.787405 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.788649 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.791931 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data" (OuterVolumeSpecName: "config-data") pod "5c2c0959-3069-4fde-b59d-1c265497ccda" (UID: "5c2c0959-3069-4fde-b59d-1c265497ccda"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792456 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data" (OuterVolumeSpecName: "config-data") pod "cba4939b-2319-41a5-8471-7b405315de18" (UID: "cba4939b-2319-41a5-8471-7b405315de18"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792460 4812 generic.go:334] "Generic (PLEG): container finished" podID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792628 4812 generic.go:334] "Generic (PLEG): container finished" podID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792514 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerDied","Data":"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792752 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerDied","Data":"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792774 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerDied","Data":"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792523 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792798 4812 scope.go:117] "RemoveContainer" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.792707 4812 generic.go:334] "Generic (PLEG): container finished" podID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" exitCode=143 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.793051 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"5c2c0959-3069-4fde-b59d-1c265497ccda","Type":"ContainerDied","Data":"969ea1be606d085762dce9c1ce34bc3a207eb834c874ba9b0322bbb4f6b8a0ee"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.794284 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796075 4812 generic.go:334] "Generic (PLEG): container finished" podID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796101 4812 generic.go:334] "Generic (PLEG): container finished" podID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796111 4812 generic.go:334] "Generic (PLEG): container finished" podID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" exitCode=143 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796137 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796161 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerDied","Data":"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796190 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerDied","Data":"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796205 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerDied","Data":"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.796216 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"8db441b6-c1f4-442d-b274-e4a80d9340fa","Type":"ContainerDied","Data":"1c37d04bf035dfbe5151af0b81eaef75ae48c84386b8a4f5cc41fb9f9beaf139"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800097 4812 generic.go:334] "Generic (PLEG): container finished" podID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800118 4812 generic.go:334] "Generic (PLEG): container finished" podID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800127 4812 generic.go:334] "Generic (PLEG): container 
finished" podID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" exitCode=143 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800162 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800174 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerDied","Data":"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800195 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerDied","Data":"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800205 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerDied","Data":"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.800214 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"532e2e2a-49ec-4141-ae9f-c61830fc352c","Type":"ContainerDied","Data":"f5494712e09fd6ed2e81dccf980dd6196dd1ac963e60f4faefca99a8d8186511"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.801184 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811583 4812 generic.go:334] "Generic (PLEG): container finished" 
podID="cba4939b-2319-41a5-8471-7b405315de18" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811634 4812 generic.go:334] "Generic (PLEG): container finished" podID="cba4939b-2319-41a5-8471-7b405315de18" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" exitCode=0 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811643 4812 generic.go:334] "Generic (PLEG): container finished" podID="cba4939b-2319-41a5-8471-7b405315de18" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" exitCode=143 Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811661 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerDied","Data":"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811704 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerDied","Data":"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811716 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerDied","Data":"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811728 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"cba4939b-2319-41a5-8471-7b405315de18","Type":"ContainerDied","Data":"cd9510e33bce5882c597239c14f813cbd16947b06dfcc3863a392de01ad9eecd"} Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.811798 
4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.812965 4812 scope.go:117] "RemoveContainer" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.834931 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.839010 4812 scope.go:117] "RemoveContainer" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.842095 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.867334 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.868037 4812 scope.go:117] "RemoveContainer" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" Jan 31 04:48:10 crc kubenswrapper[4812]: E0131 04:48:10.869107 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": container with ID starting with bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9 not found: ID does not exist" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.869150 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9"} err="failed to get container status \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": 
rpc error: code = NotFound desc = could not find container \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": container with ID starting with bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.869178 4812 scope.go:117] "RemoveContainer" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" Jan 31 04:48:10 crc kubenswrapper[4812]: E0131 04:48:10.869398 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": container with ID starting with d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947 not found: ID does not exist" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.869525 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947"} err="failed to get container status \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": rpc error: code = NotFound desc = could not find container \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": container with ID starting with d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.869542 4812 scope.go:117] "RemoveContainer" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870893 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870926 4812 
reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870939 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870950 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870960 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/5c2c0959-3069-4fde-b59d-1c265497ccda-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870972 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870983 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.870994 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4jrhv\" (UniqueName: \"kubernetes.io/projected/8db441b6-c1f4-442d-b274-e4a80d9340fa-kube-api-access-4jrhv\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871008 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc 
kubenswrapper[4812]: I0131 04:48:10.871019 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871043 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871059 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871071 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871081 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871091 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871101 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871111 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8db441b6-c1f4-442d-b274-e4a80d9340fa-httpd-run\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871122 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/cba4939b-2319-41a5-8471-7b405315de18-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: E0131 04:48:10.871036 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": container with ID starting with 7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4 not found: ID does not exist" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871164 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4"} err="failed to get container status \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": rpc error: code = NotFound desc = could not find container \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": container with ID starting with 7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871195 4812 scope.go:117] "RemoveContainer" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.871131 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8db441b6-c1f4-442d-b274-e4a80d9340fa-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.872345 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9"} 
err="failed to get container status \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": rpc error: code = NotFound desc = could not find container \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": container with ID starting with bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.872473 4812 scope.go:117] "RemoveContainer" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.872890 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947"} err="failed to get container status \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": rpc error: code = NotFound desc = could not find container \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": container with ID starting with d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.872978 4812 scope.go:117] "RemoveContainer" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.873110 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data" (OuterVolumeSpecName: "config-data") pod "8db441b6-c1f4-442d-b274-e4a80d9340fa" (UID: "8db441b6-c1f4-442d-b274-e4a80d9340fa"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.873385 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4"} err="failed to get container status \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": rpc error: code = NotFound desc = could not find container \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": container with ID starting with 7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.873482 4812 scope.go:117] "RemoveContainer" containerID="bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.874031 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9"} err="failed to get container status \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": rpc error: code = NotFound desc = could not find container \"bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9\": container with ID starting with bb59b43a241d1ac0ad00da8b9af27b12b6115d7d97a5cad8392d7fecc331dab9 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.874071 4812 scope.go:117] "RemoveContainer" containerID="d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.874465 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947"} err="failed to get container status \"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": rpc error: code = NotFound desc = could not find container 
\"d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947\": container with ID starting with d09fe361c99478615d56e7fcbeed787b8bfa1523f96bc81962e362d8c263b947 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.874489 4812 scope.go:117] "RemoveContainer" containerID="7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.876468 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4"} err="failed to get container status \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": rpc error: code = NotFound desc = could not find container \"7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4\": container with ID starting with 7dda78973b219edd89f19b82fe09e0ba7c31d7cffb9b5d5df8d4144fa9114bf4 not found: ID does not exist" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.876656 4812 scope.go:117] "RemoveContainer" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.877688 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.886038 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.889986 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.892107 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.892332 4812 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.899214 4812 scope.go:117] "RemoveContainer" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.972631 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8db441b6-c1f4-442d-b274-e4a80d9340fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.972666 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.972678 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.979951 4812 scope.go:117] "RemoveContainer" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" Jan 31 04:48:10 crc kubenswrapper[4812]: I0131 04:48:10.999707 4812 scope.go:117] "RemoveContainer" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.000155 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": container with ID starting with f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0 not found: ID does not exist" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.000203 4812 pod_container_deletor.go:53] "DeleteContainer 
returned error" containerID={"Type":"cri-o","ID":"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0"} err="failed to get container status \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": rpc error: code = NotFound desc = could not find container \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": container with ID starting with f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.000226 4812 scope.go:117] "RemoveContainer" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.000587 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": container with ID starting with e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57 not found: ID does not exist" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.000616 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57"} err="failed to get container status \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": rpc error: code = NotFound desc = could not find container \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": container with ID starting with e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.000635 4812 scope.go:117] "RemoveContainer" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.001057 4812 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": container with ID starting with 34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3 not found: ID does not exist" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001085 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3"} err="failed to get container status \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": rpc error: code = NotFound desc = could not find container \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": container with ID starting with 34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001102 4812 scope.go:117] "RemoveContainer" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001324 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0"} err="failed to get container status \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": rpc error: code = NotFound desc = could not find container \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": container with ID starting with f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001348 4812 scope.go:117] "RemoveContainer" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001784 4812 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57"} err="failed to get container status \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": rpc error: code = NotFound desc = could not find container \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": container with ID starting with e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.001827 4812 scope.go:117] "RemoveContainer" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.002220 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3"} err="failed to get container status \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": rpc error: code = NotFound desc = could not find container \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": container with ID starting with 34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.002335 4812 scope.go:117] "RemoveContainer" containerID="f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.003554 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0"} err="failed to get container status \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": rpc error: code = NotFound desc = could not find container \"f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0\": container with ID starting with 
f0ef639a0dab9fc00a86e9ce6bb3a057566ec6f261d46f6dd6a9e80fea61a6f0 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.003577 4812 scope.go:117] "RemoveContainer" containerID="e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.003808 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57"} err="failed to get container status \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": rpc error: code = NotFound desc = could not find container \"e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57\": container with ID starting with e4ae56dc30ffe2ce05daf84c2749f9b4f3b77886dd78de9f5a3f57dea612fd57 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.003917 4812 scope.go:117] "RemoveContainer" containerID="34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.004204 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3"} err="failed to get container status \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": rpc error: code = NotFound desc = could not find container \"34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3\": container with ID starting with 34657afbc79df18acb0d7ac56ac2e69c0b3afdb385e7c0387809ee3802a825e3 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.004227 4812 scope.go:117] "RemoveContainer" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.029231 4812 scope.go:117] "RemoveContainer" containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" 
Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.046928 4812 scope.go:117] "RemoveContainer" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.064767 4812 scope.go:117] "RemoveContainer" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.066818 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": container with ID starting with 130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f not found: ID does not exist" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.066871 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f"} err="failed to get container status \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": rpc error: code = NotFound desc = could not find container \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": container with ID starting with 130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.066897 4812 scope.go:117] "RemoveContainer" containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.067269 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": container with ID starting with d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246 not found: ID does not exist" 
containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.067338 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246"} err="failed to get container status \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": rpc error: code = NotFound desc = could not find container \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": container with ID starting with d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.067371 4812 scope.go:117] "RemoveContainer" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.067792 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": container with ID starting with 2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103 not found: ID does not exist" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.067848 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103"} err="failed to get container status \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": rpc error: code = NotFound desc = could not find container \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": container with ID starting with 2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.067870 4812 scope.go:117] 
"RemoveContainer" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068135 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f"} err="failed to get container status \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": rpc error: code = NotFound desc = could not find container \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": container with ID starting with 130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068156 4812 scope.go:117] "RemoveContainer" containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068457 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246"} err="failed to get container status \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": rpc error: code = NotFound desc = could not find container \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": container with ID starting with d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068486 4812 scope.go:117] "RemoveContainer" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068850 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103"} err="failed to get container status \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": rpc error: code = 
NotFound desc = could not find container \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": container with ID starting with 2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.068873 4812 scope.go:117] "RemoveContainer" containerID="130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.069274 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f"} err="failed to get container status \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": rpc error: code = NotFound desc = could not find container \"130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f\": container with ID starting with 130cb5eff746cac783366a2f87e2e8b55e59a27542af782e43fc942cfe2e4d9f not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.069340 4812 scope.go:117] "RemoveContainer" containerID="d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.069708 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246"} err="failed to get container status \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": rpc error: code = NotFound desc = could not find container \"d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246\": container with ID starting with d47b53e0df463825219a8e1ea70872e0155bd024035458a70bf0f79bcbba8246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.069729 4812 scope.go:117] "RemoveContainer" containerID="2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103" Jan 31 04:48:11 crc 
kubenswrapper[4812]: I0131 04:48:11.070186 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103"} err="failed to get container status \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": rpc error: code = NotFound desc = could not find container \"2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103\": container with ID starting with 2c03bec05c570fa5c3a57b4ff515be01d280236dbc94217e6513a1959dfa6103 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.070239 4812 scope.go:117] "RemoveContainer" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.094113 4812 scope.go:117] "RemoveContainer" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.115453 4812 scope.go:117] "RemoveContainer" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.139661 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.141673 4812 scope.go:117] "RemoveContainer" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.143474 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": container with ID starting with aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e not found: ID does not exist" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.143524 4812 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e"} err="failed to get container status \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": rpc error: code = NotFound desc = could not find container \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": container with ID starting with aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.143558 4812 scope.go:117] "RemoveContainer" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 04:48:11.144020 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": container with ID starting with 34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246 not found: ID does not exist" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.144126 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246"} err="failed to get container status \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": rpc error: code = NotFound desc = could not find container \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": container with ID starting with 34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.144198 4812 scope.go:117] "RemoveContainer" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" Jan 31 04:48:11 crc kubenswrapper[4812]: E0131 
04:48:11.144799 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": container with ID starting with 932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775 not found: ID does not exist" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.144856 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775"} err="failed to get container status \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": rpc error: code = NotFound desc = could not find container \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": container with ID starting with 932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.144880 4812 scope.go:117] "RemoveContainer" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.145295 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e"} err="failed to get container status \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": rpc error: code = NotFound desc = could not find container \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": container with ID starting with aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.145318 4812 scope.go:117] "RemoveContainer" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" Jan 31 04:48:11 crc 
kubenswrapper[4812]: I0131 04:48:11.145602 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246"} err="failed to get container status \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": rpc error: code = NotFound desc = could not find container \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": container with ID starting with 34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.145630 4812 scope.go:117] "RemoveContainer" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.147681 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775"} err="failed to get container status \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": rpc error: code = NotFound desc = could not find container \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": container with ID starting with 932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.147772 4812 scope.go:117] "RemoveContainer" containerID="aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.148696 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e"} err="failed to get container status \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": rpc error: code = NotFound desc = could not find container \"aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e\": container 
with ID starting with aff4f9df32231ae68771db8bb9a9160c7943fcf411e9e19a96cf0047b0d67f3e not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.148725 4812 scope.go:117] "RemoveContainer" containerID="34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.149339 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246"} err="failed to get container status \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": rpc error: code = NotFound desc = could not find container \"34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246\": container with ID starting with 34ac83eac7e6e08b9bf7d835e0bd4cbfe4f416546e73e76cf144168eeef27246 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.149384 4812 scope.go:117] "RemoveContainer" containerID="932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.149696 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775"} err="failed to get container status \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": rpc error: code = NotFound desc = could not find container \"932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775\": container with ID starting with 932ccc416113afeb1642715a5e49df704d6a457208f5f2895f75c1b425749775 not found: ID does not exist" Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.151384 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.769924 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.770590 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-log" containerID="cri-o://48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" gracePeriod=30 Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.770671 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-api" containerID="cri-o://43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" gracePeriod=30 Jan 31 04:48:11 crc kubenswrapper[4812]: I0131 04:48:11.770731 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-httpd" containerID="cri-o://3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" gracePeriod=30 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.271637 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.274508 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-log" containerID="cri-o://88e1bcad1eaf5302747b4e15a51e0f87ec3f612549475615a8b8074643e26cd5" gracePeriod=30 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.274663 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-api" 
containerID="cri-o://a3e32a28184e365c87ed6478bc1f2546b5682e97e871939b919694391d1da710" gracePeriod=30 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.274699 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-httpd" containerID="cri-o://5e30560af9499ca05fdc3aa7d51047b5836a053b55102c0a2d3274593dba2818" gracePeriod=30 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.352643 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" path="/var/lib/kubelet/pods/532e2e2a-49ec-4141-ae9f-c61830fc352c/volumes" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.353684 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" path="/var/lib/kubelet/pods/5c2c0959-3069-4fde-b59d-1c265497ccda/volumes" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.355880 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" path="/var/lib/kubelet/pods/8db441b6-c1f4-442d-b274-e4a80d9340fa/volumes" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.356871 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="cba4939b-2319-41a5-8471-7b405315de18" path="/var/lib/kubelet/pods/cba4939b-2319-41a5-8471-7b405315de18/volumes" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.627948 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697534 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697652 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9g4w6\" (UniqueName: \"kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697689 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697748 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697793 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697874 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697885 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys" (OuterVolumeSpecName: "sys") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697944 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.697990 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698052 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698084 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698072 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698102 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698156 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run" (OuterVolumeSpecName: "run") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698205 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698257 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698337 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698379 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698420 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs\") pod \"c849d85a-5e07-42e8-98f3-506b9e711ae7\" (UID: \"c849d85a-5e07-42e8-98f3-506b9e711ae7\") " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698421 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.698488 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev" (OuterVolumeSpecName: "dev") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699126 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699155 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699172 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699190 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699191 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699206 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699264 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699292 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c849d85a-5e07-42e8-98f3-506b9e711ae7-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.699700 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs" (OuterVolumeSpecName: "logs") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.702192 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.702595 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts" (OuterVolumeSpecName: "scripts") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). 
InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.703263 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.703451 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6" (OuterVolumeSpecName: "kube-api-access-9g4w6") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "kube-api-access-9g4w6". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.779579 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data" (OuterVolumeSpecName: "config-data") pod "c849d85a-5e07-42e8-98f3-506b9e711ae7" (UID: "c849d85a-5e07-42e8-98f3-506b9e711ae7"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802250 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802289 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802302 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c849d85a-5e07-42e8-98f3-506b9e711ae7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802321 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802333 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802345 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c849d85a-5e07-42e8-98f3-506b9e711ae7-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.802358 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9g4w6\" (UniqueName: \"kubernetes.io/projected/c849d85a-5e07-42e8-98f3-506b9e711ae7-kube-api-access-9g4w6\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.819950 4812 operation_generator.go:917] UnmountDevice succeeded for volume 
"local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.821401 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.845983 4812 generic.go:334] "Generic (PLEG): container finished" podID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" exitCode=0 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846370 4812 generic.go:334] "Generic (PLEG): container finished" podID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" exitCode=0 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846393 4812 generic.go:334] "Generic (PLEG): container finished" podID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" exitCode=143 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846293 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846189 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerDied","Data":"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846508 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerDied","Data":"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846532 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerDied","Data":"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846551 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c849d85a-5e07-42e8-98f3-506b9e711ae7","Type":"ContainerDied","Data":"d62f0c0f9230bf98391a5dd08d9766d994a47be992898fc450a34f9b4495d1e0"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.846576 4812 scope.go:117] "RemoveContainer" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852133 4812 generic.go:334] "Generic (PLEG): container finished" podID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerID="a3e32a28184e365c87ed6478bc1f2546b5682e97e871939b919694391d1da710" exitCode=0 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852167 4812 generic.go:334] "Generic (PLEG): container finished" podID="9a4530e6-7d8e-4821-b55e-3489281c477d" 
containerID="5e30560af9499ca05fdc3aa7d51047b5836a053b55102c0a2d3274593dba2818" exitCode=0 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852175 4812 generic.go:334] "Generic (PLEG): container finished" podID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerID="88e1bcad1eaf5302747b4e15a51e0f87ec3f612549475615a8b8074643e26cd5" exitCode=143 Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852196 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerDied","Data":"a3e32a28184e365c87ed6478bc1f2546b5682e97e871939b919694391d1da710"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852224 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerDied","Data":"5e30560af9499ca05fdc3aa7d51047b5836a053b55102c0a2d3274593dba2818"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.852234 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerDied","Data":"88e1bcad1eaf5302747b4e15a51e0f87ec3f612549475615a8b8074643e26cd5"} Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.892590 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.898014 4812 scope.go:117] "RemoveContainer" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.902429 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.903536 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.903565 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.923258 4812 scope.go:117] "RemoveContainer" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.948607 4812 scope.go:117] "RemoveContainer" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" Jan 31 04:48:12 crc kubenswrapper[4812]: E0131 04:48:12.948988 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": container with ID starting with 43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78 not found: ID does not exist" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.949043 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78"} err="failed to get container status \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": rpc error: code = NotFound desc = could not find container \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": container with ID starting with 43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.949075 4812 scope.go:117] "RemoveContainer" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" Jan 31 04:48:12 crc kubenswrapper[4812]: E0131 04:48:12.949756 4812 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": container with ID starting with 3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6 not found: ID does not exist" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.949795 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6"} err="failed to get container status \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": rpc error: code = NotFound desc = could not find container \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": container with ID starting with 3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.949821 4812 scope.go:117] "RemoveContainer" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" Jan 31 04:48:12 crc kubenswrapper[4812]: E0131 04:48:12.950128 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": container with ID starting with 48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526 not found: ID does not exist" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.950176 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526"} err="failed to get container status \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": rpc error: code = NotFound 
desc = could not find container \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": container with ID starting with 48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.950209 4812 scope.go:117] "RemoveContainer" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.957675 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78"} err="failed to get container status \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": rpc error: code = NotFound desc = could not find container \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": container with ID starting with 43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.957715 4812 scope.go:117] "RemoveContainer" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.958224 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6"} err="failed to get container status \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": rpc error: code = NotFound desc = could not find container \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": container with ID starting with 3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.958274 4812 scope.go:117] "RemoveContainer" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 
04:48:12.958666 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526"} err="failed to get container status \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": rpc error: code = NotFound desc = could not find container \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": container with ID starting with 48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.958691 4812 scope.go:117] "RemoveContainer" containerID="43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.963375 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78"} err="failed to get container status \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": rpc error: code = NotFound desc = could not find container \"43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78\": container with ID starting with 43cd4926d5d70579ef71cce5206666b0906907eacce1ebd47a0cd8c7f5305b78 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.963411 4812 scope.go:117] "RemoveContainer" containerID="3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.965386 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6"} err="failed to get container status \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": rpc error: code = NotFound desc = could not find container \"3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6\": container with ID starting with 
3f4a20195c6f47d6466a1608ad92873884a4d8ad5fd711594d6500f4c5bc5ad6 not found: ID does not exist" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.965427 4812 scope.go:117] "RemoveContainer" containerID="48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526" Jan 31 04:48:12 crc kubenswrapper[4812]: I0131 04:48:12.965688 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526"} err="failed to get container status \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": rpc error: code = NotFound desc = could not find container \"48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526\": container with ID starting with 48a4e48ff7f1a3578e44bcec54a58be117d95365577a716f18b6da900b1ed526 not found: ID does not exist" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.077557 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209563 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209637 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209691 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod 
\"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209730 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209755 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209798 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209878 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys" (OuterVolumeSpecName: "sys") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209886 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209941 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev" (OuterVolumeSpecName: "dev") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209971 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.209985 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210004 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210036 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run" (OuterVolumeSpecName: "run") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210046 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fvgtq\" (UniqueName: \"kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210080 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210107 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210148 4812 reconciler_common.go:159] 
"operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210196 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs\") pod \"9a4530e6-7d8e-4821-b55e-3489281c477d\" (UID: \"9a4530e6-7d8e-4821-b55e-3489281c477d\") " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210451 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210482 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210517 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210638 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210665 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210681 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210695 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210712 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210728 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.210741 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.211016 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs" (OuterVolumeSpecName: "logs") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.211195 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.213054 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts" (OuterVolumeSpecName: "scripts") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.214450 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.215403 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq" (OuterVolumeSpecName: "kube-api-access-fvgtq") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "kube-api-access-fvgtq". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.220062 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.278707 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data" (OuterVolumeSpecName: "config-data") pod "9a4530e6-7d8e-4821-b55e-3489281c477d" (UID: "9a4530e6-7d8e-4821-b55e-3489281c477d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312701 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312785 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312869 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312891 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/9a4530e6-7d8e-4821-b55e-3489281c477d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312911 4812 
reconciler_common.go:293] "Volume detached for volume \"kube-api-access-fvgtq\" (UniqueName: \"kubernetes.io/projected/9a4530e6-7d8e-4821-b55e-3489281c477d-kube-api-access-fvgtq\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312958 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/9a4530e6-7d8e-4821-b55e-3489281c477d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.312974 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/9a4530e6-7d8e-4821-b55e-3489281c477d-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.333910 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.336638 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.414225 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.414281 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.865037 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"9a4530e6-7d8e-4821-b55e-3489281c477d","Type":"ContainerDied","Data":"24bb03fb5bf6f9e34d428536f734d4efb64cb6e12d732c97d30a3c58f7ef3764"} Jan 31 
04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.865062 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.865145 4812 scope.go:117] "RemoveContainer" containerID="a3e32a28184e365c87ed6478bc1f2546b5682e97e871939b919694391d1da710" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.898556 4812 scope.go:117] "RemoveContainer" containerID="5e30560af9499ca05fdc3aa7d51047b5836a053b55102c0a2d3274593dba2818" Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.912495 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.924720 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:13 crc kubenswrapper[4812]: I0131 04:48:13.936522 4812 scope.go:117] "RemoveContainer" containerID="88e1bcad1eaf5302747b4e15a51e0f87ec3f612549475615a8b8074643e26cd5" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.356407 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" path="/var/lib/kubelet/pods/9a4530e6-7d8e-4821-b55e-3489281c477d/volumes" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.357939 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" path="/var/lib/kubelet/pods/c849d85a-5e07-42e8-98f3-506b9e711ae7/volumes" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.940611 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rns6p"] Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.956278 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-rns6p"] Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971436 4812 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance3e69-account-delete-jsn86"] Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971780 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971802 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971873 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971887 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971902 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971911 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971921 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971929 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971938 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971945 4812 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971967 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971976 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.971987 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.971995 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972007 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972016 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972037 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972045 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972063 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972071 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" 
containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972088 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972096 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972111 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972119 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972132 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972140 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972156 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972164 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972181 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972189 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-log" Jan 31 04:48:14 crc 
kubenswrapper[4812]: E0131 04:48:14.972205 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972212 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972226 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972234 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: E0131 04:48:14.972244 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972252 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972407 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972421 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972434 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972443 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 
04:48:14.972453 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972464 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972474 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8db441b6-c1f4-442d-b274-e4a80d9340fa" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972485 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="cba4939b-2319-41a5-8471-7b405315de18" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972495 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972504 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972513 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972527 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972537 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972551 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9a4530e6-7d8e-4821-b55e-3489281c477d" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 
04:48:14.972560 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5c2c0959-3069-4fde-b59d-1c265497ccda" containerName="glance-api" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972573 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-httpd" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972585 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c849d85a-5e07-42e8-98f3-506b9e711ae7" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.972598 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="532e2e2a-49ec-4141-ae9f-c61830fc352c" containerName="glance-log" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.973208 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:14 crc kubenswrapper[4812]: I0131 04:48:14.979523 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3e69-account-delete-jsn86"] Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.137422 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbqlc\" (UniqueName: \"kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.138102 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 
crc kubenswrapper[4812]: I0131 04:48:15.239855 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lbqlc\" (UniqueName: \"kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.239918 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.240620 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.260880 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbqlc\" (UniqueName: \"kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc\") pod \"glance3e69-account-delete-jsn86\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.291083 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.587226 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance3e69-account-delete-jsn86"] Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.884366 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" event={"ID":"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5","Type":"ContainerStarted","Data":"cd2b227df70adcd77a1fd00609f84655ff153e429369e1133b0221e400f9f704"} Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.884634 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" event={"ID":"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5","Type":"ContainerStarted","Data":"bdbd3916b5a21ddff420d8126acc4d592b899bfae8610b7f160e833a96e84abe"} Jan 31 04:48:15 crc kubenswrapper[4812]: I0131 04:48:15.898027 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" podStartSLOduration=1.8980120409999999 podStartE2EDuration="1.898012041s" podCreationTimestamp="2026-01-31 04:48:14 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:15.895227106 +0000 UTC m=+1304.390248771" watchObservedRunningTime="2026-01-31 04:48:15.898012041 +0000 UTC m=+1304.393033706" Jan 31 04:48:16 crc kubenswrapper[4812]: I0131 04:48:16.354659 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f05ab78-8195-45b9-8370-fd54e6ef1e75" path="/var/lib/kubelet/pods/1f05ab78-8195-45b9-8370-fd54e6ef1e75/volumes" Jan 31 04:48:16 crc kubenswrapper[4812]: I0131 04:48:16.893643 4812 generic.go:334] "Generic (PLEG): container finished" podID="52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" containerID="cd2b227df70adcd77a1fd00609f84655ff153e429369e1133b0221e400f9f704" 
exitCode=0 Jan 31 04:48:16 crc kubenswrapper[4812]: I0131 04:48:16.893700 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" event={"ID":"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5","Type":"ContainerDied","Data":"cd2b227df70adcd77a1fd00609f84655ff153e429369e1133b0221e400f9f704"} Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.156720 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.284776 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lbqlc\" (UniqueName: \"kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc\") pod \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.284865 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts\") pod \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\" (UID: \"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5\") " Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.285825 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" (UID: "52f37ff6-7c65-4a3c-9518-eac5f84bb4b5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.291365 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc" (OuterVolumeSpecName: "kube-api-access-lbqlc") pod "52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" (UID: "52f37ff6-7c65-4a3c-9518-eac5f84bb4b5"). InnerVolumeSpecName "kube-api-access-lbqlc". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.387459 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lbqlc\" (UniqueName: \"kubernetes.io/projected/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-kube-api-access-lbqlc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.387503 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.914015 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" event={"ID":"52f37ff6-7c65-4a3c-9518-eac5f84bb4b5","Type":"ContainerDied","Data":"bdbd3916b5a21ddff420d8126acc4d592b899bfae8610b7f160e833a96e84abe"} Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.914082 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="bdbd3916b5a21ddff420d8126acc4d592b899bfae8610b7f160e833a96e84abe" Jan 31 04:48:18 crc kubenswrapper[4812]: I0131 04:48:18.914126 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance3e69-account-delete-jsn86" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.010919 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-z5xnk"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.022076 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-z5xnk"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.037942 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance3e69-account-delete-jsn86"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.049043 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance3e69-account-delete-jsn86"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.055070 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-3e69-account-create-update-8nv9x"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.061154 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-3e69-account-create-update-8nv9x"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.350249 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4" path="/var/lib/kubelet/pods/4d7a410c-5f45-43ba-8f11-f3d5ab50b2c4/volumes" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.350848 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" path="/var/lib/kubelet/pods/52f37ff6-7c65-4a3c-9518-eac5f84bb4b5/volumes" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.351396 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d7fd73e1-c08b-4d86-992d-42da4bee71ec" path="/var/lib/kubelet/pods/d7fd73e1-c08b-4d86-992d-42da4bee71ec/volumes" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.494318 4812 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/glance-db-create-qkk5m"] Jan 31 04:48:20 crc kubenswrapper[4812]: E0131 04:48:20.495496 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" containerName="mariadb-account-delete" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.495635 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" containerName="mariadb-account-delete" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.496220 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="52f37ff6-7c65-4a3c-9518-eac5f84bb4b5" containerName="mariadb-account-delete" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.497347 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.566678 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-qkk5m"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.577467 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-2b13-account-create-update-x5b6p"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.578483 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.581405 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.593590 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2b13-account-create-update-x5b6p"] Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.662678 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j9kb7\" (UniqueName: \"kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.662734 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.662755 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.663036 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4mww\" (UniqueName: 
\"kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.763955 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.764236 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.764359 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-k4mww\" (UniqueName: \"kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.764435 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-j9kb7\" (UniqueName: \"kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.766938 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.767331 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.785424 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-j9kb7\" (UniqueName: \"kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7\") pod \"glance-db-create-qkk5m\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.800046 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4mww\" (UniqueName: \"kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww\") pod \"glance-2b13-account-create-update-x5b6p\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.863052 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:20 crc kubenswrapper[4812]: I0131 04:48:20.891545 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.185924 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-qkk5m"] Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.223242 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-2b13-account-create-update-x5b6p"] Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.956111 4812 generic.go:334] "Generic (PLEG): container finished" podID="06a85d8f-31d6-470f-8169-4bb0b8d8fe87" containerID="3cec6b79468f1dc473449d009be78ed472173e623493772d3e2578cb84aa1eb6" exitCode=0 Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.956399 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qkk5m" event={"ID":"06a85d8f-31d6-470f-8169-4bb0b8d8fe87","Type":"ContainerDied","Data":"3cec6b79468f1dc473449d009be78ed472173e623493772d3e2578cb84aa1eb6"} Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.956471 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qkk5m" event={"ID":"06a85d8f-31d6-470f-8169-4bb0b8d8fe87","Type":"ContainerStarted","Data":"b5294617fae6b198ed4ac4386afe85eaa2c6bcb22311387a35f71f7d944f6ede"} Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.960501 4812 generic.go:334] "Generic (PLEG): container finished" podID="fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" containerID="25b9311210eee8271b5ac70f79a7c1f9add1137009ef0612cf6443e0f3b7dd00" exitCode=0 Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.960540 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" event={"ID":"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8","Type":"ContainerDied","Data":"25b9311210eee8271b5ac70f79a7c1f9add1137009ef0612cf6443e0f3b7dd00"} Jan 31 04:48:21 crc kubenswrapper[4812]: I0131 04:48:21.960562 4812 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" event={"ID":"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8","Type":"ContainerStarted","Data":"21eb37c119c5b014984a2ede9870232b32a3cc64dbf2685a018b32338c097f37"} Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.297866 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.304480 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j9kb7\" (UniqueName: \"kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7\") pod \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.304645 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts\") pod \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\" (UID: \"06a85d8f-31d6-470f-8169-4bb0b8d8fe87\") " Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.306077 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "06a85d8f-31d6-470f-8169-4bb0b8d8fe87" (UID: "06a85d8f-31d6-470f-8169-4bb0b8d8fe87"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.320108 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7" (OuterVolumeSpecName: "kube-api-access-j9kb7") pod "06a85d8f-31d6-470f-8169-4bb0b8d8fe87" (UID: "06a85d8f-31d6-470f-8169-4bb0b8d8fe87"). 
InnerVolumeSpecName "kube-api-access-j9kb7". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.371307 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.405469 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k4mww\" (UniqueName: \"kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww\") pod \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.405626 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts\") pod \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\" (UID: \"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8\") " Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.405954 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-j9kb7\" (UniqueName: \"kubernetes.io/projected/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-kube-api-access-j9kb7\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.405994 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/06a85d8f-31d6-470f-8169-4bb0b8d8fe87-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.406133 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" (UID: "fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.410006 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww" (OuterVolumeSpecName: "kube-api-access-k4mww") pod "fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" (UID: "fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8"). InnerVolumeSpecName "kube-api-access-k4mww". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.507800 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k4mww\" (UniqueName: \"kubernetes.io/projected/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-kube-api-access-k4mww\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.507904 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.984942 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-qkk5m" event={"ID":"06a85d8f-31d6-470f-8169-4bb0b8d8fe87","Type":"ContainerDied","Data":"b5294617fae6b198ed4ac4386afe85eaa2c6bcb22311387a35f71f7d944f6ede"} Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.984984 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b5294617fae6b198ed4ac4386afe85eaa2c6bcb22311387a35f71f7d944f6ede" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.985039 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-qkk5m" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.996412 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" event={"ID":"fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8","Type":"ContainerDied","Data":"21eb37c119c5b014984a2ede9870232b32a3cc64dbf2685a018b32338c097f37"} Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.996452 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="21eb37c119c5b014984a2ede9870232b32a3cc64dbf2685a018b32338c097f37" Jan 31 04:48:23 crc kubenswrapper[4812]: I0131 04:48:23.996599 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-2b13-account-create-update-x5b6p" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.725471 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-p4bcx"] Jan 31 04:48:25 crc kubenswrapper[4812]: E0131 04:48:25.726003 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="06a85d8f-31d6-470f-8169-4bb0b8d8fe87" containerName="mariadb-database-create" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.726019 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="06a85d8f-31d6-470f-8169-4bb0b8d8fe87" containerName="mariadb-database-create" Jan 31 04:48:25 crc kubenswrapper[4812]: E0131 04:48:25.726043 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" containerName="mariadb-account-create-update" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.726051 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" containerName="mariadb-account-create-update" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.726229 4812 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="06a85d8f-31d6-470f-8169-4bb0b8d8fe87" containerName="mariadb-database-create" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.726245 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" containerName="mariadb-account-create-update" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.726756 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.730176 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-4wb7b" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.730327 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.735177 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p4bcx"] Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.838422 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wtf7s\" (UniqueName: \"kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.838823 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.839015 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.940443 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-wtf7s\" (UniqueName: \"kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.940567 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.940635 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.946484 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.957520 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:25 crc kubenswrapper[4812]: I0131 04:48:25.962293 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-wtf7s\" (UniqueName: \"kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s\") pod \"glance-db-sync-p4bcx\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:26 crc kubenswrapper[4812]: I0131 04:48:26.052283 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:26 crc kubenswrapper[4812]: I0131 04:48:26.524711 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p4bcx"] Jan 31 04:48:27 crc kubenswrapper[4812]: I0131 04:48:27.020668 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p4bcx" event={"ID":"0661a9af-d008-487a-b40e-081859d7aa65","Type":"ContainerStarted","Data":"b7c15be8cc678db92d5ada888eba722d0acf5bc96c7b649ebf97f93cab97ca76"} Jan 31 04:48:28 crc kubenswrapper[4812]: I0131 04:48:28.031232 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p4bcx" event={"ID":"0661a9af-d008-487a-b40e-081859d7aa65","Type":"ContainerStarted","Data":"4b377c58a34afd1cf83cbcda6b639f017fa79cedaa3ca68b6e33353d35e8487a"} Jan 31 04:48:28 crc kubenswrapper[4812]: I0131 04:48:28.055169 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-p4bcx" podStartSLOduration=3.055146667 podStartE2EDuration="3.055146667s" podCreationTimestamp="2026-01-31 04:48:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-01-31 04:48:28.053944955 +0000 UTC m=+1316.548966650" watchObservedRunningTime="2026-01-31 04:48:28.055146667 +0000 UTC m=+1316.550168362" Jan 31 04:48:30 crc kubenswrapper[4812]: I0131 04:48:30.053696 4812 generic.go:334] "Generic (PLEG): container finished" podID="0661a9af-d008-487a-b40e-081859d7aa65" containerID="4b377c58a34afd1cf83cbcda6b639f017fa79cedaa3ca68b6e33353d35e8487a" exitCode=0 Jan 31 04:48:30 crc kubenswrapper[4812]: I0131 04:48:30.053951 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p4bcx" event={"ID":"0661a9af-d008-487a-b40e-081859d7aa65","Type":"ContainerDied","Data":"4b377c58a34afd1cf83cbcda6b639f017fa79cedaa3ca68b6e33353d35e8487a"} Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.415064 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.426878 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data\") pod \"0661a9af-d008-487a-b40e-081859d7aa65\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.426947 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wtf7s\" (UniqueName: \"kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s\") pod \"0661a9af-d008-487a-b40e-081859d7aa65\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.427028 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data\") pod \"0661a9af-d008-487a-b40e-081859d7aa65\" (UID: \"0661a9af-d008-487a-b40e-081859d7aa65\") " 
Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.438197 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "0661a9af-d008-487a-b40e-081859d7aa65" (UID: "0661a9af-d008-487a-b40e-081859d7aa65"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.453169 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s" (OuterVolumeSpecName: "kube-api-access-wtf7s") pod "0661a9af-d008-487a-b40e-081859d7aa65" (UID: "0661a9af-d008-487a-b40e-081859d7aa65"). InnerVolumeSpecName "kube-api-access-wtf7s". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.473575 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data" (OuterVolumeSpecName: "config-data") pod "0661a9af-d008-487a-b40e-081859d7aa65" (UID: "0661a9af-d008-487a-b40e-081859d7aa65"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.529015 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.529046 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wtf7s\" (UniqueName: \"kubernetes.io/projected/0661a9af-d008-487a-b40e-081859d7aa65-kube-api-access-wtf7s\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:31 crc kubenswrapper[4812]: I0131 04:48:31.529057 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/0661a9af-d008-487a-b40e-081859d7aa65-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:32 crc kubenswrapper[4812]: I0131 04:48:32.076602 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-p4bcx" event={"ID":"0661a9af-d008-487a-b40e-081859d7aa65","Type":"ContainerDied","Data":"b7c15be8cc678db92d5ada888eba722d0acf5bc96c7b649ebf97f93cab97ca76"} Jan 31 04:48:32 crc kubenswrapper[4812]: I0131 04:48:32.076672 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b7c15be8cc678db92d5ada888eba722d0acf5bc96c7b649ebf97f93cab97ca76" Jan 31 04:48:32 crc kubenswrapper[4812]: I0131 04:48:32.076629 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-p4bcx" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.146055 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:33 crc kubenswrapper[4812]: E0131 04:48:33.146655 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0661a9af-d008-487a-b40e-081859d7aa65" containerName="glance-db-sync" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.146667 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0661a9af-d008-487a-b40e-081859d7aa65" containerName="glance-db-sync" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.146793 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="0661a9af-d008-487a-b40e-081859d7aa65" containerName="glance-db-sync" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.147451 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.149265 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.150012 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-4wb7b" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.150079 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.213650 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.253459 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") 
pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354523 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354572 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354601 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354631 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354657 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs\") 
pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354686 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354715 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354766 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354798 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354827 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354881 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jfchz\" (UniqueName: \"kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354926 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.354968 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.355001 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.355005 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume 
\"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.372243 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456756 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456817 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456863 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456885 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456907 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456926 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456940 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.456979 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457002 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi\") 
pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457033 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457058 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jfchz\" (UniqueName: \"kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457056 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457079 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457127 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data\") pod \"glance-default-external-api-0\" (UID: 
\"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457461 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457521 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457558 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457562 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457631 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457684 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457700 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457783 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.457903 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.465722 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc 
kubenswrapper[4812]: I0131 04:48:33.466145 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.481190 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jfchz\" (UniqueName: \"kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.491235 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-0\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.730740 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.732250 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.734120 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.748930 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.768130 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862082 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862166 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862196 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862221 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862345 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules\") pod \"glance-default-internal-api-0\" 
(UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862384 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862419 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862598 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862639 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862692 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi\") pod 
\"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862709 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862729 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862795 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.862832 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhzm2\" (UniqueName: \"kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964274 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964312 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964332 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964348 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964366 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964431 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964455 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964477 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964493 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964511 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964530 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick\") pod 
\"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964553 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-lhzm2\" (UniqueName: \"kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964576 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.964591 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965203 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965253 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: 
\"fd564f21-f972-49e1-b961-04a3bb3884fc\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965340 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965416 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965564 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965658 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965715 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965740 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.965990 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.966035 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.966370 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.969777 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc 
kubenswrapper[4812]: I0131 04:48:33.972393 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.987307 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhzm2\" (UniqueName: \"kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:33 crc kubenswrapper[4812]: I0131 04:48:33.988776 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:34 crc kubenswrapper[4812]: I0131 04:48:34.005163 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:34 crc kubenswrapper[4812]: I0131 04:48:34.055306 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:34 crc kubenswrapper[4812]: I0131 04:48:34.077361 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:34 crc kubenswrapper[4812]: I0131 04:48:34.231509 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:48:34 crc kubenswrapper[4812]: I0131 04:48:34.521342 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:34 crc kubenswrapper[4812]: W0131 04:48:34.528526 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podfd564f21_f972_49e1_b961_04a3bb3884fc.slice/crio-1e9f723922d70719d5f1e6ca582a036fd65216e4828e85771fa2f8de1e1b4419 WatchSource:0}: Error finding container 1e9f723922d70719d5f1e6ca582a036fd65216e4828e85771fa2f8de1e1b4419: Status 404 returned error can't find the container with id 1e9f723922d70719d5f1e6ca582a036fd65216e4828e85771fa2f8de1e1b4419 Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.104070 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerStarted","Data":"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.104736 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerStarted","Data":"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.104754 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerStarted","Data":"1e9f723922d70719d5f1e6ca582a036fd65216e4828e85771fa2f8de1e1b4419"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.104224 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-httpd" containerID="cri-o://5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" gracePeriod=30 Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.104164 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-log" containerID="cri-o://441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" gracePeriod=30 Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.108618 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerStarted","Data":"ce6ea52ac92effcabb49d1ceda7589687801b9c4abd85f4371c43a64c444e466"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.108685 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerStarted","Data":"f8b34c40f3b1dc3dcbd37f9daa024fefcaaa45604e5ccb593eb3a139f1d562e9"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.108718 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerStarted","Data":"7293e299a5b755b3863563b98eea88c6ca814acd49ef123ecc796f95fbd6f18d"} Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.145202 4812 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=3.145167947 podStartE2EDuration="3.145167947s" podCreationTimestamp="2026-01-31 04:48:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:35.142882675 +0000 UTC m=+1323.637904400" watchObservedRunningTime="2026-01-31 04:48:35.145167947 +0000 UTC m=+1323.640189662" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.181212 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=2.181184434 podStartE2EDuration="2.181184434s" podCreationTimestamp="2026-01-31 04:48:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:35.17435395 +0000 UTC m=+1323.669375645" watchObservedRunningTime="2026-01-31 04:48:35.181184434 +0000 UTC m=+1323.676206129" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.574188 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687139 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lhzm2\" (UniqueName: \"kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687177 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687206 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687229 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687258 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687276 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687298 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687294 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev" (OuterVolumeSpecName: "dev") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687297 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run" (OuterVolumeSpecName: "run") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687341 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687338 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687359 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687318 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687404 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687451 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687503 
4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687528 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687574 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687595 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi\") pod \"fd564f21-f972-49e1-b961-04a3bb3884fc\" (UID: \"fd564f21-f972-49e1-b961-04a3bb3884fc\") " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687925 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687939 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687950 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-nvme\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687973 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687983 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687370 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys" (OuterVolumeSpecName: "sys") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.687583 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.688101 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.688287 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs" (OuterVolumeSpecName: "logs") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.692663 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts" (OuterVolumeSpecName: "scripts") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.693058 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.693963 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "local-storage02-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.694276 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2" (OuterVolumeSpecName: "kube-api-access-lhzm2") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "kube-api-access-lhzm2". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.747423 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data" (OuterVolumeSpecName: "config-data") pod "fd564f21-f972-49e1-b961-04a3bb3884fc" (UID: "fd564f21-f972-49e1-b961-04a3bb3884fc"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789179 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789234 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789269 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789285 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: 
I0131 04:48:35.789301 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-lhzm2\" (UniqueName: \"kubernetes.io/projected/fd564f21-f972-49e1-b961-04a3bb3884fc-kube-api-access-lhzm2\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789314 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/fd564f21-f972-49e1-b961-04a3bb3884fc-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789327 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/fd564f21-f972-49e1-b961-04a3bb3884fc-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789344 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.789356 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/fd564f21-f972-49e1-b961-04a3bb3884fc-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.803887 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.809938 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 04:48:35 crc kubenswrapper[4812]: I0131 04:48:35.890805 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:35 crc 
kubenswrapper[4812]: I0131 04:48:35.890858 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.115733 4812 generic.go:334] "Generic (PLEG): container finished" podID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerID="5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" exitCode=143 Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.116092 4812 generic.go:334] "Generic (PLEG): container finished" podID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerID="441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" exitCode=143 Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.115795 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerDied","Data":"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202"} Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.115818 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.116147 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerDied","Data":"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a"} Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.116173 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"fd564f21-f972-49e1-b961-04a3bb3884fc","Type":"ContainerDied","Data":"1e9f723922d70719d5f1e6ca582a036fd65216e4828e85771fa2f8de1e1b4419"} Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.116194 4812 scope.go:117] "RemoveContainer" containerID="5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.149224 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.151757 4812 scope.go:117] "RemoveContainer" containerID="441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.155917 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.171680 4812 scope.go:117] "RemoveContainer" containerID="5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" Jan 31 04:48:36 crc kubenswrapper[4812]: E0131 04:48:36.173539 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202\": container with ID starting with 5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202 not found: ID does not exist" 
containerID="5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.173598 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202"} err="failed to get container status \"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202\": rpc error: code = NotFound desc = could not find container \"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202\": container with ID starting with 5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202 not found: ID does not exist" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.173628 4812 scope.go:117] "RemoveContainer" containerID="441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" Jan 31 04:48:36 crc kubenswrapper[4812]: E0131 04:48:36.175724 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a\": container with ID starting with 441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a not found: ID does not exist" containerID="441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.175787 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a"} err="failed to get container status \"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a\": rpc error: code = NotFound desc = could not find container \"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a\": container with ID starting with 441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a not found: ID does not exist" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.175855 4812 scope.go:117] 
"RemoveContainer" containerID="5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.176276 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202"} err="failed to get container status \"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202\": rpc error: code = NotFound desc = could not find container \"5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202\": container with ID starting with 5578d3bff5613dd45aa1119f41f66976f4ab3c673a2ad735ddca3e7ae3eb2202 not found: ID does not exist" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.176338 4812 scope.go:117] "RemoveContainer" containerID="441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.176779 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a"} err="failed to get container status \"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a\": rpc error: code = NotFound desc = could not find container \"441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a\": container with ID starting with 441f3abafd97dc1e531ce02e753f9c6ceb475fe04c75a4214be2c4f0a553942a not found: ID does not exist" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.178327 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:36 crc kubenswrapper[4812]: E0131 04:48:36.178596 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-log" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.178615 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" 
containerName="glance-log" Jan 31 04:48:36 crc kubenswrapper[4812]: E0131 04:48:36.178638 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-httpd" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.178647 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-httpd" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.178815 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-httpd" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.178851 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" containerName="glance-log" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.179542 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.185144 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.199111 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296399 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296466 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296501 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296691 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296770 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296806 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296894 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296926 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rwk9c\" (UniqueName: \"kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296946 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.296979 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.297014 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.297112 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.297160 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.297185 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.357890 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fd564f21-f972-49e1-b961-04a3bb3884fc" path="/var/lib/kubelet/pods/fd564f21-f972-49e1-b961-04a3bb3884fc/volumes" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399360 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399464 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme\") 
pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399536 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399628 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399808 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.399914 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400072 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") 
" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400147 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400161 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400302 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400334 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-rwk9c\" (UniqueName: \"kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400432 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 
04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400510 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400613 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400667 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") device mount path \"/mnt/openstack/pv07\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400755 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400174 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400910 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400965 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400993 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") device mount path \"/mnt/openstack/pv02\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.400704 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.401043 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.401008 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.401095 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.401153 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.405898 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.409143 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.429618 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-rwk9c\" (UniqueName: 
\"kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.437312 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.439816 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"glance-default-internal-api-0\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.499816 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:36 crc kubenswrapper[4812]: I0131 04:48:36.941584 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:48:36 crc kubenswrapper[4812]: W0131 04:48:36.944762 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod65f93101_dc86_4191_b695_a32bc59c1f5c.slice/crio-182a21555d06f4276bf3db69d574e1d802dedca0ee1fc6894c361453cf79e9f1 WatchSource:0}: Error finding container 182a21555d06f4276bf3db69d574e1d802dedca0ee1fc6894c361453cf79e9f1: Status 404 returned error can't find the container with id 182a21555d06f4276bf3db69d574e1d802dedca0ee1fc6894c361453cf79e9f1 Jan 31 04:48:37 crc kubenswrapper[4812]: I0131 04:48:37.128293 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerStarted","Data":"182a21555d06f4276bf3db69d574e1d802dedca0ee1fc6894c361453cf79e9f1"} Jan 31 04:48:38 crc kubenswrapper[4812]: I0131 04:48:38.145967 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerStarted","Data":"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501"} Jan 31 04:48:38 crc kubenswrapper[4812]: I0131 04:48:38.146389 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerStarted","Data":"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37"} Jan 31 04:48:38 crc kubenswrapper[4812]: I0131 04:48:38.172460 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.172433685 
podStartE2EDuration="2.172433685s" podCreationTimestamp="2026-01-31 04:48:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:38.166331452 +0000 UTC m=+1326.661353157" watchObservedRunningTime="2026-01-31 04:48:38.172433685 +0000 UTC m=+1326.667455380" Jan 31 04:48:43 crc kubenswrapper[4812]: I0131 04:48:43.769071 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:43 crc kubenswrapper[4812]: I0131 04:48:43.769931 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:43 crc kubenswrapper[4812]: I0131 04:48:43.811476 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:43 crc kubenswrapper[4812]: I0131 04:48:43.835777 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:44 crc kubenswrapper[4812]: I0131 04:48:44.203831 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:44 crc kubenswrapper[4812]: I0131 04:48:44.204267 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:44 crc kubenswrapper[4812]: I0131 04:48:44.338508 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:48:44 crc kubenswrapper[4812]: I0131 04:48:44.338567 4812 prober.go:107] "Probe failed" probeType="Liveness" 
pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.011723 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.030804 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.500314 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.500756 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.554594 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:46 crc kubenswrapper[4812]: I0131 04:48:46.571540 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:47 crc kubenswrapper[4812]: I0131 04:48:47.234149 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:47 crc kubenswrapper[4812]: I0131 04:48:47.234200 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:49 crc kubenswrapper[4812]: I0131 04:48:49.132662 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 
31 04:48:49 crc kubenswrapper[4812]: I0131 04:48:49.166779 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.222084 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.223356 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.238489 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.239882 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.253179 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.278274 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.363754 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364045 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: 
\"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364129 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364210 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364309 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364395 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364480 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run\") pod 
\"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364564 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vznb\" (UniqueName: \"kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364641 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364708 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364809 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364911 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.364992 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365076 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365147 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365245 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365340 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365601 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pkxhf\" (UniqueName: \"kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.365729 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366318 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366459 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366493 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366518 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366648 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366779 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366818 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366895 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.366943 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.447172 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.448730 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468336 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468379 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468400 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" 
(UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468418 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468438 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468455 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468470 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468485 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468501 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-9vznb\" (UniqueName: \"kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468519 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468537 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468556 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468571 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468588 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468611 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468626 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468643 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468661 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod 
\"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468679 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-pkxhf\" (UniqueName: \"kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468696 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468714 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468737 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468752 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme\") pod \"glance-default-external-api-2\" (UID: 
\"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468768 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468795 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468817 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468856 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.468879 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " 
pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.469244 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.469285 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.469535 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.481072 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.481485 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc 
kubenswrapper[4812]: I0131 04:48:52.481522 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.481697 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.484365 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.484499 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.484913 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.484973 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485014 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485039 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485072 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485094 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485116 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.485195 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") device mount path \"/mnt/openstack/pv20\"" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.486076 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.486165 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") device mount path \"/mnt/openstack/pv10\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.486185 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.486365 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.486085 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.487018 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.487341 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.488041 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.488295 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.490207 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.491086 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.501298 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.506196 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.512111 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.512318 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod 
\"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.518034 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pkxhf\" (UniqueName: \"kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.518931 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.519276 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vznb\" (UniqueName: \"kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb\") pod \"glance-default-external-api-1\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.524802 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"glance-default-external-api-2\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.540458 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.554606 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.570925 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.570989 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571019 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571052 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571076 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules\") pod \"glance-default-internal-api-2\" (UID: 
\"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571103 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rb9b\" (UniqueName: \"kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571127 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571151 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571184 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571213 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571240 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571268 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571291 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571317 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571339 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571362 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571382 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571405 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571424 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571449 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bpbrp\" (UniqueName: \"kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571474 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571497 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571517 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571544 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: 
I0131 04:48:52.571565 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571593 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571613 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.571635 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.672832 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 
04:48:52.673198 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673224 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673247 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673002 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673299 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673267 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673341 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-7rb9b\" (UniqueName: \"kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673361 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673383 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673411 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673433 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673453 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673475 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673491 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673512 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673528 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run\") 
pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673545 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673561 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673579 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673596 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673617 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bpbrp\" (UniqueName: \"kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp\") pod 
\"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673637 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673655 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673670 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673691 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673708 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: 
\"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673728 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673742 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673760 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673830 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.673877 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 
31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674027 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674062 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674025 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674074 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674118 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674145 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674169 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674404 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674833 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674895 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674906 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674909 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.674853 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.675104 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.675133 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.675141 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.675251 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.675255 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.679290 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.679486 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.680086 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.688234 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.692371 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rb9b\" (UniqueName: \"kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.692969 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bpbrp\" (UniqueName: \"kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.703338 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.704546 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") 
pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.705134 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-internal-api-2\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.705689 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-1\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.767134 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:48:52 crc kubenswrapper[4812]: I0131 04:48:52.941408 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.005567 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:48:53 crc kubenswrapper[4812]: W0131 04:48:53.005982 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbc50d989_cc66_4d83_a741_eced5a632d1e.slice/crio-3a7173e982c244171df48c4b866b2b8eaec9d05796b38074ac72e7a25f8a8f50 WatchSource:0}: Error finding container 3a7173e982c244171df48c4b866b2b8eaec9d05796b38074ac72e7a25f8a8f50: Status 404 returned error can't find the container with id 3a7173e982c244171df48c4b866b2b8eaec9d05796b38074ac72e7a25f8a8f50 Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.096892 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.203075 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.281661 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerStarted","Data":"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd"} Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.281708 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerStarted","Data":"05b39e087a2dd2dde2a74ff8b0eda93ba633b81cd57207d26d894a0fa994b3d4"} Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.283016 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" 
event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerStarted","Data":"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094"} Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.283055 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerStarted","Data":"3a7173e982c244171df48c4b866b2b8eaec9d05796b38074ac72e7a25f8a8f50"} Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.283920 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerStarted","Data":"54e11f6b656c6269acb5f324c891b85533c658f571d15f5af11332b9c2c6a07e"} Jan 31 04:48:53 crc kubenswrapper[4812]: I0131 04:48:53.344158 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:48:53 crc kubenswrapper[4812]: W0131 04:48:53.353959 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68af4f82_0682_47be_9e94_45a59ba99778.slice/crio-d31c314743d4e0d343ca6c08c32ff5a2775ea24cbe54822178804c9d48a1b612 WatchSource:0}: Error finding container d31c314743d4e0d343ca6c08c32ff5a2775ea24cbe54822178804c9d48a1b612: Status 404 returned error can't find the container with id d31c314743d4e0d343ca6c08c32ff5a2775ea24cbe54822178804c9d48a1b612 Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.301364 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerStarted","Data":"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.302164 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerStarted","Data":"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.304636 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerStarted","Data":"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.308099 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerStarted","Data":"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.308197 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerStarted","Data":"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.308260 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerStarted","Data":"d31c314743d4e0d343ca6c08c32ff5a2775ea24cbe54822178804c9d48a1b612"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.311292 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerStarted","Data":"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed"} Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.343504 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=3.3434705940000002 podStartE2EDuration="3.343470594s" podCreationTimestamp="2026-01-31 04:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:54.337098543 +0000 UTC m=+1342.832120268" watchObservedRunningTime="2026-01-31 04:48:54.343470594 +0000 UTC m=+1342.838492299" Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.378982 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-2" podStartSLOduration=3.378952807 podStartE2EDuration="3.378952807s" podCreationTimestamp="2026-01-31 04:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:54.375340939 +0000 UTC m=+1342.870362634" watchObservedRunningTime="2026-01-31 04:48:54.378952807 +0000 UTC m=+1342.873974512" Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.412598 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-2" podStartSLOduration=3.412580359 podStartE2EDuration="3.412580359s" podCreationTimestamp="2026-01-31 04:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:54.410674197 +0000 UTC m=+1342.905695902" watchObservedRunningTime="2026-01-31 04:48:54.412580359 +0000 UTC m=+1342.907602034" Jan 31 04:48:54 crc kubenswrapper[4812]: I0131 04:48:54.452102 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=3.452083429 podStartE2EDuration="3.452083429s" podCreationTimestamp="2026-01-31 04:48:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:48:54.44432159 +0000 UTC m=+1342.939343285" watchObservedRunningTime="2026-01-31 04:48:54.452083429 +0000 UTC m=+1342.947105114" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.540719 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.541448 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.554963 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.555025 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.578078 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.599584 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.617793 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.619546 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.767993 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: 
I0131 04:49:02.768235 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.801224 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.830793 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.942034 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.942077 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.983431 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:02 crc kubenswrapper[4812]: I0131 04:49:02.992221 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406152 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406194 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406208 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406221 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" 
status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406233 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406244 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406258 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:03 crc kubenswrapper[4812]: I0131 04:49:03.406271 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.350411 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.390812 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.401575 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.424188 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.424214 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.424658 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.425345 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:49:05 crc kubenswrapper[4812]: 
I0131 04:49:05.425361 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.446980 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.459529 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.517346 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.524868 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:05 crc kubenswrapper[4812]: I0131 04:49:05.596432 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:06 crc kubenswrapper[4812]: I0131 04:49:06.064213 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:49:06 crc kubenswrapper[4812]: I0131 04:49:06.076606 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:49:06 crc kubenswrapper[4812]: I0131 04:49:06.204683 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:49:06 crc kubenswrapper[4812]: I0131 04:49:06.210687 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436127 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" 
containerName="glance-log" containerID="cri-o://0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436468 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-httpd" containerID="cri-o://221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436200 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-2" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-httpd" containerID="cri-o://5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436381 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-log" containerID="cri-o://34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436634 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-log" containerID="cri-o://21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436672 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-2" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-httpd" containerID="cri-o://ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467" gracePeriod=30 Jan 31 04:49:07 crc 
kubenswrapper[4812]: I0131 04:49:07.436710 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-log" containerID="cri-o://07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.436797 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-httpd" containerID="cri-o://be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95" gracePeriod=30 Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.447043 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.127:9292/healthcheck\": EOF" Jan 31 04:49:07 crc kubenswrapper[4812]: I0131 04:49:07.447177 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.127:9292/healthcheck\": EOF" Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.465594 4812 generic.go:334] "Generic (PLEG): container finished" podID="68af4f82-0682-47be-9e94-45a59ba99778" containerID="21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c" exitCode=143 Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.465686 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerDied","Data":"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c"} Jan 31 04:49:08 crc kubenswrapper[4812]: 
I0131 04:49:08.468416 4812 generic.go:334] "Generic (PLEG): container finished" podID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerID="0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094" exitCode=143 Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.468486 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerDied","Data":"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094"} Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.471639 4812 generic.go:334] "Generic (PLEG): container finished" podID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerID="07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408" exitCode=143 Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.471706 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerDied","Data":"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408"} Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.474405 4812 generic.go:334] "Generic (PLEG): container finished" podID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerID="34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd" exitCode=143 Jan 31 04:49:08 crc kubenswrapper[4812]: I0131 04:49:08.474441 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerDied","Data":"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.084526 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.089537 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.096645 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.159603 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243206 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243245 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243281 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9vznb\" (UniqueName: \"kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243318 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" 
(UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243326 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys" (OuterVolumeSpecName: "sys") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243343 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243344 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243362 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243392 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243419 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243437 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243461 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243484 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme\") pod 
\"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243502 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243525 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243549 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243568 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243586 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243610 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243636 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243658 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243677 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243696 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243722 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243733 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rb9b\" (UniqueName: \"kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243754 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243757 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243774 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243780 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243796 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243814 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243856 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243876 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243894 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243947 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pkxhf\" (UniqueName: \"kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf\") pod 
\"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243950 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys" (OuterVolumeSpecName: "sys") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.243966 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244001 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244022 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244057 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244052 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run" (OuterVolumeSpecName: "run") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244075 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs" (OuterVolumeSpecName: "logs") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244079 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244111 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244116 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244142 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run" (OuterVolumeSpecName: "run") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244145 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244166 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run" (OuterVolumeSpecName: "run") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244173 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bpbrp\" (UniqueName: \"kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244192 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244203 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244216 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys" (OuterVolumeSpecName: "sys") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244225 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244238 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244244 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244264 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev" (OuterVolumeSpecName: "dev") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244270 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244294 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244312 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244332 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244352 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244376 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme\") 
pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244384 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244431 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244407 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244461 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244483 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc 
kubenswrapper[4812]: I0131 04:49:11.244505 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules\") pod \"bc50d989-cc66-4d83-a741-eced5a632d1e\" (UID: \"bc50d989-cc66-4d83-a741-eced5a632d1e\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244511 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244528 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244553 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") pod \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\" (UID: \"6b05c672-dbc6-4c18-99c7-1e7b593599fa\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244576 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run\") pod \"68af4f82-0682-47be-9e94-45a59ba99778\" (UID: \"68af4f82-0682-47be-9e94-45a59ba99778\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244603 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.244624 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"baa876b0-65fc-4050-8b54-3855d6f6565a\" (UID: \"baa876b0-65fc-4050-8b54-3855d6f6565a\") " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245086 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245100 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245110 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245120 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245130 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245140 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 
31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245151 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245159 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245170 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245179 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245189 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245198 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245209 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245220 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-logs\") on 
node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245229 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/bc50d989-cc66-4d83-a741-eced5a632d1e-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245239 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245248 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245260 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245135 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev" (OuterVolumeSpecName: "dev") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.245158 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.246618 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.246702 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.249418 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts" (OuterVolumeSpecName: "scripts") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.249570 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "local-storage08-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.249803 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance-cache") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250068 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250287 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs" (OuterVolumeSpecName: "logs") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250533 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs" (OuterVolumeSpecName: "logs") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250720 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250774 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250804 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys" (OuterVolumeSpecName: "sys") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.250849 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run" (OuterVolumeSpecName: "run") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.254350 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b" (OuterVolumeSpecName: "kube-api-access-7rb9b") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "kube-api-access-7rb9b". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.254439 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.255649 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.255655 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage10-crc" (OuterVolumeSpecName: "glance") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "local-storage10-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.255647 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage20-crc" (OuterVolumeSpecName: "glance-cache") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "local-storage20-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.255691 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.255704 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256104 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs" (OuterVolumeSpecName: "logs") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256181 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb" (OuterVolumeSpecName: "kube-api-access-9vznb") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "kube-api-access-9vznb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256200 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev" (OuterVolumeSpecName: "dev") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256233 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256263 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev" (OuterVolumeSpecName: "dev") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256291 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256296 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf" (OuterVolumeSpecName: "kube-api-access-pkxhf") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "kube-api-access-pkxhf". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.256349 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts" (OuterVolumeSpecName: "scripts") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.257491 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts" (OuterVolumeSpecName: "scripts") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.257979 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.258249 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.259182 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp" (OuterVolumeSpecName: "kube-api-access-bpbrp") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "kube-api-access-bpbrp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.259758 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts" (OuterVolumeSpecName: "scripts") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.287619 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data" (OuterVolumeSpecName: "config-data") pod "68af4f82-0682-47be-9e94-45a59ba99778" (UID: "68af4f82-0682-47be-9e94-45a59ba99778"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.295142 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data" (OuterVolumeSpecName: "config-data") pod "baa876b0-65fc-4050-8b54-3855d6f6565a" (UID: "baa876b0-65fc-4050-8b54-3855d6f6565a"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.300001 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data" (OuterVolumeSpecName: "config-data") pod "6b05c672-dbc6-4c18-99c7-1e7b593599fa" (UID: "6b05c672-dbc6-4c18-99c7-1e7b593599fa"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.303496 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data" (OuterVolumeSpecName: "config-data") pod "bc50d989-cc66-4d83-a741-eced5a632d1e" (UID: "bc50d989-cc66-4d83-a741-eced5a632d1e"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346098 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346125 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346156 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346164 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346173 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/baa876b0-65fc-4050-8b54-3855d6f6565a-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346186 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346194 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-9vznb\" (UniqueName: \"kubernetes.io/projected/6b05c672-dbc6-4c18-99c7-1e7b593599fa-kube-api-access-9vznb\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346208 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice 
started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346216 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346224 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346231 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/bc50d989-cc66-4d83-a741-eced5a632d1e-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346239 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346248 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346255 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346267 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346276 4812 reconciler_common.go:293] 
"Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346284 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7rb9b\" (UniqueName: \"kubernetes.io/projected/68af4f82-0682-47be-9e94-45a59ba99778-kube-api-access-7rb9b\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346298 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346310 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346318 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346325 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346333 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pkxhf\" (UniqueName: \"kubernetes.io/projected/bc50d989-cc66-4d83-a741-eced5a632d1e-kube-api-access-pkxhf\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346343 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc 
kubenswrapper[4812]: I0131 04:49:11.346351 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/6b05c672-dbc6-4c18-99c7-1e7b593599fa-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346359 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346601 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346610 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bpbrp\" (UniqueName: \"kubernetes.io/projected/baa876b0-65fc-4050-8b54-3855d6f6565a-kube-api-access-bpbrp\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346619 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346626 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/68af4f82-0682-47be-9e94-45a59ba99778-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346635 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/baa876b0-65fc-4050-8b54-3855d6f6565a-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346643 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: 
\"kubernetes.io/secret/6b05c672-dbc6-4c18-99c7-1e7b593599fa-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346650 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346674 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" " Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346685 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/68af4f82-0682-47be-9e94-45a59ba99778-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346695 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/6b05c672-dbc6-4c18-99c7-1e7b593599fa-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346705 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/baa876b0-65fc-4050-8b54-3855d6f6565a-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346716 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/bc50d989-cc66-4d83-a741-eced5a632d1e-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.346727 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/68af4f82-0682-47be-9e94-45a59ba99778-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.361507 4812 operation_generator.go:917] 
UnmountDevice succeeded for volume "local-storage10-crc" (UniqueName: "kubernetes.io/local-volume/local-storage10-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.366867 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: "kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.369893 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage20-crc" (UniqueName: "kubernetes.io/local-volume/local-storage20-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.371307 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.371971 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.373634 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.375140 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.377765 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447871 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 
04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447901 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447911 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447920 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage20-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage20-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447928 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage10-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage10-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447936 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447944 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.447952 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.500028 4812 generic.go:334] "Generic (PLEG): container finished" podID="68af4f82-0682-47be-9e94-45a59ba99778" containerID="ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467" exitCode=0 Jan 31 04:49:11 crc kubenswrapper[4812]: 
I0131 04:49:11.500094 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerDied","Data":"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.500141 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-2" event={"ID":"68af4f82-0682-47be-9e94-45a59ba99778","Type":"ContainerDied","Data":"d31c314743d4e0d343ca6c08c32ff5a2775ea24cbe54822178804c9d48a1b612"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.500159 4812 scope.go:117] "RemoveContainer" containerID="ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.500602 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-2" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.519189 4812 generic.go:334] "Generic (PLEG): container finished" podID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerID="5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed" exitCode=0 Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.519296 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerDied","Data":"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.519328 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-2" event={"ID":"bc50d989-cc66-4d83-a741-eced5a632d1e","Type":"ContainerDied","Data":"3a7173e982c244171df48c4b866b2b8eaec9d05796b38074ac72e7a25f8a8f50"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.519638 4812 util.go:48] "No ready sandbox for pod can be 
found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-2" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.538737 4812 generic.go:334] "Generic (PLEG): container finished" podID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerID="be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95" exitCode=0 Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.538817 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerDied","Data":"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.538857 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"baa876b0-65fc-4050-8b54-3855d6f6565a","Type":"ContainerDied","Data":"54e11f6b656c6269acb5f324c891b85533c658f571d15f5af11332b9c2c6a07e"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.538930 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.565387 4812 generic.go:334] "Generic (PLEG): container finished" podID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerID="221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586" exitCode=0 Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.565457 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerDied","Data":"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.565516 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"6b05c672-dbc6-4c18-99c7-1e7b593599fa","Type":"ContainerDied","Data":"05b39e087a2dd2dde2a74ff8b0eda93ba633b81cd57207d26d894a0fa994b3d4"} Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.565627 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.572968 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.576825 4812 scope.go:117] "RemoveContainer" containerID="21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.584997 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-2"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.612876 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.629609 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-2"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.643014 4812 scope.go:117] "RemoveContainer" containerID="ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.643594 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467\": container with ID starting with ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467 not found: ID does not exist" containerID="ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.643644 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467"} err="failed to get container status \"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467\": rpc error: code = NotFound desc = could not find container 
\"ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467\": container with ID starting with ccebcb3e8d58dabe24ed730957d3e20b542ab35598859ebf069c7537a494d467 not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.643676 4812 scope.go:117] "RemoveContainer" containerID="21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.643973 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c\": container with ID starting with 21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c not found: ID does not exist" containerID="21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.644009 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c"} err="failed to get container status \"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c\": rpc error: code = NotFound desc = could not find container \"21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c\": container with ID starting with 21ceb4cdae2f727d2a806c765affbcc0f96ad308caa805690d266ddc2204c18c not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.644035 4812 scope.go:117] "RemoveContainer" containerID="5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.662922 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.688403 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:49:11 crc 
kubenswrapper[4812]: I0131 04:49:11.692742 4812 scope.go:117] "RemoveContainer" containerID="0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.696875 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.708569 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.745625 4812 scope.go:117] "RemoveContainer" containerID="5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.746157 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed\": container with ID starting with 5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed not found: ID does not exist" containerID="5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.746189 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed"} err="failed to get container status \"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed\": rpc error: code = NotFound desc = could not find container \"5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed\": container with ID starting with 5de660726b8407c38cbea7590a2663cdfd3ffe63f694115d2ce7455d9dbda6ed not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.746210 4812 scope.go:117] "RemoveContainer" containerID="0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.746547 4812 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094\": container with ID starting with 0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094 not found: ID does not exist" containerID="0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.746585 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094"} err="failed to get container status \"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094\": rpc error: code = NotFound desc = could not find container \"0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094\": container with ID starting with 0de1ee56a5ef1a3bd4c00d33c77e31bb72feef7dd51b03857a2d4191e5217094 not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.746616 4812 scope.go:117] "RemoveContainer" containerID="be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.767183 4812 scope.go:117] "RemoveContainer" containerID="07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.784900 4812 scope.go:117] "RemoveContainer" containerID="be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.785271 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95\": container with ID starting with be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95 not found: ID does not exist" containerID="be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95" Jan 31 
04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.785302 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95"} err="failed to get container status \"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95\": rpc error: code = NotFound desc = could not find container \"be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95\": container with ID starting with be88d6bf2491c33a7eb3c491a6967709ed9f16200916dc5120e358758344ba95 not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.785324 4812 scope.go:117] "RemoveContainer" containerID="07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.785751 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408\": container with ID starting with 07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408 not found: ID does not exist" containerID="07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.785854 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408"} err="failed to get container status \"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408\": rpc error: code = NotFound desc = could not find container \"07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408\": container with ID starting with 07a304da947715f3acc5ceb44b8a66e26445305740ce38f21c5bdef4df9dc408 not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.785889 4812 scope.go:117] "RemoveContainer" 
containerID="221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.803351 4812 scope.go:117] "RemoveContainer" containerID="34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.817932 4812 scope.go:117] "RemoveContainer" containerID="221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.818312 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586\": container with ID starting with 221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586 not found: ID does not exist" containerID="221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.818339 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586"} err="failed to get container status \"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586\": rpc error: code = NotFound desc = could not find container \"221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586\": container with ID starting with 221255980f0444e51bbf989ad461fab3c6216745dfaa021c4526d9a9ebfe2586 not found: ID does not exist" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.818359 4812 scope.go:117] "RemoveContainer" containerID="34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd" Jan 31 04:49:11 crc kubenswrapper[4812]: E0131 04:49:11.818756 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd\": container with ID starting with 
34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd not found: ID does not exist" containerID="34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd" Jan 31 04:49:11 crc kubenswrapper[4812]: I0131 04:49:11.818797 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd"} err="failed to get container status \"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd\": rpc error: code = NotFound desc = could not find container \"34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd\": container with ID starting with 34d772149fa8d023c90b47e08eed37bc3a9090b17afb0f1224dca736aba61ebd not found: ID does not exist" Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.349584 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="68af4f82-0682-47be-9e94-45a59ba99778" path="/var/lib/kubelet/pods/68af4f82-0682-47be-9e94-45a59ba99778/volumes" Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.350787 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" path="/var/lib/kubelet/pods/6b05c672-dbc6-4c18-99c7-1e7b593599fa/volumes" Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.351401 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" path="/var/lib/kubelet/pods/baa876b0-65fc-4050-8b54-3855d6f6565a/volumes" Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.352524 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" path="/var/lib/kubelet/pods/bc50d989-cc66-4d83-a741-eced5a632d1e/volumes" Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.353020 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 
04:49:12.353253 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-log" containerID="cri-o://f8b34c40f3b1dc3dcbd37f9daa024fefcaaa45604e5ccb593eb3a139f1d562e9" gracePeriod=30 Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.353344 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-httpd" containerID="cri-o://ce6ea52ac92effcabb49d1ceda7589687801b9c4abd85f4371c43a64c444e466" gracePeriod=30 Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.579695 4812 generic.go:334] "Generic (PLEG): container finished" podID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerID="f8b34c40f3b1dc3dcbd37f9daa024fefcaaa45604e5ccb593eb3a139f1d562e9" exitCode=143 Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.579815 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerDied","Data":"f8b34c40f3b1dc3dcbd37f9daa024fefcaaa45604e5ccb593eb3a139f1d562e9"} Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.973801 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.974104 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-log" containerID="cri-o://d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501" gracePeriod=30 Jan 31 04:49:12 crc kubenswrapper[4812]: I0131 04:49:12.974393 4812 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-httpd" containerID="cri-o://cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37" gracePeriod=30 Jan 31 04:49:13 crc kubenswrapper[4812]: I0131 04:49:13.592952 4812 generic.go:334] "Generic (PLEG): container finished" podID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerID="d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501" exitCode=143 Jan 31 04:49:13 crc kubenswrapper[4812]: I0131 04:49:13.593066 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerDied","Data":"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501"} Jan 31 04:49:14 crc kubenswrapper[4812]: I0131 04:49:14.338180 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:49:14 crc kubenswrapper[4812]: I0131 04:49:14.338585 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:49:15 crc kubenswrapper[4812]: I0131 04:49:15.618012 4812 generic.go:334] "Generic (PLEG): container finished" podID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerID="ce6ea52ac92effcabb49d1ceda7589687801b9c4abd85f4371c43a64c444e466" exitCode=0 Jan 31 04:49:15 crc kubenswrapper[4812]: I0131 04:49:15.618129 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerDied","Data":"ce6ea52ac92effcabb49d1ceda7589687801b9c4abd85f4371c43a64c444e466"} Jan 31 04:49:15 crc kubenswrapper[4812]: I0131 04:49:15.912021 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013762 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013835 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jfchz\" (UniqueName: \"kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013916 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013934 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013978 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" 
(UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.013999 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014029 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014051 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014084 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014126 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014165 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014180 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run" (OuterVolumeSpecName: "run") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014203 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys" (OuterVolumeSpecName: "sys") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014209 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014232 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014253 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014280 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data\") pod \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\" (UID: \"b2c47516-b290-4a70-a899-c0ffa0f8e3b9\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014684 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014706 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 
04:49:16.014718 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014793 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev" (OuterVolumeSpecName: "dev") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014874 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.014904 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.019979 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.020431 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs" (OuterVolumeSpecName: "logs") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.020469 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.025335 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts" (OuterVolumeSpecName: "scripts") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.027279 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz" (OuterVolumeSpecName: "kube-api-access-jfchz") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "kube-api-access-jfchz". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.029703 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.041984 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "local-storage04-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.095026 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data" (OuterVolumeSpecName: "config-data") pod "b2c47516-b290-4a70-a899-c0ffa0f8e3b9" (UID: "b2c47516-b290-4a70-a899-c0ffa0f8e3b9"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117100 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jfchz\" (UniqueName: \"kubernetes.io/projected/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-kube-api-access-jfchz\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117172 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117186 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117199 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117211 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117227 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117239 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117250 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: 
\"kubernetes.io/empty-dir/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117261 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117271 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.117282 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/b2c47516-b290-4a70-a899-c0ffa0f8e3b9-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.138661 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.157028 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.218960 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.219271 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.392675 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523509 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523575 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523604 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run" (OuterVolumeSpecName: "run") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523621 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523670 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys" (OuterVolumeSpecName: "sys") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523698 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523732 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523773 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523819 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523819 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523817 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523883 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev" (OuterVolumeSpecName: "dev") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.523892 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rwk9c\" (UniqueName: \"kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524080 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524148 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: 
I0131 04:49:16.524219 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524275 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524339 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524407 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") pod \"65f93101-dc86-4191-b695-a32bc59c1f5c\" (UID: \"65f93101-dc86-4191-b695-a32bc59c1f5c\") " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524435 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524469 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524487 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.524600 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs" (OuterVolumeSpecName: "logs") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525208 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525245 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/65f93101-dc86-4191-b695-a32bc59c1f5c-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525269 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525293 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525315 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525337 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525358 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525383 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.525405 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/65f93101-dc86-4191-b695-a32bc59c1f5c-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.527256 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts" (OuterVolumeSpecName: "scripts") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.527293 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c" (OuterVolumeSpecName: "kube-api-access-rwk9c") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "kube-api-access-rwk9c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.527368 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage07-crc" (OuterVolumeSpecName: "glance") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "local-storage07-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.528485 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage02-crc" (OuterVolumeSpecName: "glance-cache") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). 
InnerVolumeSpecName "local-storage02-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.569393 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data" (OuterVolumeSpecName: "config-data") pod "65f93101-dc86-4191-b695-a32bc59c1f5c" (UID: "65f93101-dc86-4191-b695-a32bc59c1f5c"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.627099 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.627146 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-rwk9c\" (UniqueName: \"kubernetes.io/projected/65f93101-dc86-4191-b695-a32bc59c1f5c-kube-api-access-rwk9c\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.627166 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.627193 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" " Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.627212 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/65f93101-dc86-4191-b695-a32bc59c1f5c-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.629381 4812 generic.go:334] "Generic (PLEG): container finished" podID="65f93101-dc86-4191-b695-a32bc59c1f5c" 
containerID="cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37" exitCode=0 Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.629450 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerDied","Data":"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37"} Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.629482 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"65f93101-dc86-4191-b695-a32bc59c1f5c","Type":"ContainerDied","Data":"182a21555d06f4276bf3db69d574e1d802dedca0ee1fc6894c361453cf79e9f1"} Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.629507 4812 scope.go:117] "RemoveContainer" containerID="cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.629639 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.644029 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage02-crc" (UniqueName: "kubernetes.io/local-volume/local-storage02-crc") on node "crc" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.644705 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"b2c47516-b290-4a70-a899-c0ffa0f8e3b9","Type":"ContainerDied","Data":"7293e299a5b755b3863563b98eea88c6ca814acd49ef123ecc796f95fbd6f18d"} Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.644891 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.654878 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage07-crc" (UniqueName: "kubernetes.io/local-volume/local-storage07-crc") on node "crc" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.671221 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.676436 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.681356 4812 scope.go:117] "RemoveContainer" containerID="d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.690918 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.693665 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.702036 4812 scope.go:117] "RemoveContainer" containerID="cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37" Jan 31 04:49:16 crc kubenswrapper[4812]: E0131 04:49:16.702644 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37\": container with ID starting with cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37 not found: ID does not exist" containerID="cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.702709 4812 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37"} err="failed to get container status \"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37\": rpc error: code = NotFound desc = could not find container \"cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37\": container with ID starting with cd0b03094dc0f35ef3e088a41f9673f0a1181fb9a46e5770209a2f246ca8ad37 not found: ID does not exist" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.702765 4812 scope.go:117] "RemoveContainer" containerID="d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501" Jan 31 04:49:16 crc kubenswrapper[4812]: E0131 04:49:16.703496 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501\": container with ID starting with d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501 not found: ID does not exist" containerID="d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.703536 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501"} err="failed to get container status \"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501\": rpc error: code = NotFound desc = could not find container \"d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501\": container with ID starting with d171e6cfbbeb4b420d8d70a7862d8556f410d3f2be09a6fdfe93cbae00ccc501 not found: ID does not exist" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.703562 4812 scope.go:117] "RemoveContainer" containerID="ce6ea52ac92effcabb49d1ceda7589687801b9c4abd85f4371c43a64c444e466" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.726972 4812 scope.go:117] "RemoveContainer" 
containerID="f8b34c40f3b1dc3dcbd37f9daa024fefcaaa45604e5ccb593eb3a139f1d562e9" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.728297 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage07-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage07-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:16 crc kubenswrapper[4812]: I0131 04:49:16.728328 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage02-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage02-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.899176 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p4bcx"] Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.905658 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-p4bcx"] Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.945936 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance2b13-account-delete-26hxs"] Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946269 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946292 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946307 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946316 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946326 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946333 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946353 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946360 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946376 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946383 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946393 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946400 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946411 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946420 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946439 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" 
containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946448 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946460 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946467 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946478 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946484 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946496 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946503 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: E0131 04:49:17.946514 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946522 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946668 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-log" Jan 31 04:49:17 crc 
kubenswrapper[4812]: I0131 04:49:17.946685 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946702 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946714 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946724 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946737 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946745 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="baa876b0-65fc-4050-8b54-3855d6f6565a" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946756 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946766 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="68af4f82-0682-47be-9e94-45a59ba99778" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946777 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bc50d989-cc66-4d83-a741-eced5a632d1e" containerName="glance-log" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.946786 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6b05c672-dbc6-4c18-99c7-1e7b593599fa" containerName="glance-httpd" Jan 31 04:49:17 crc 
kubenswrapper[4812]: I0131 04:49:17.946797 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" containerName="glance-httpd" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.947374 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:17 crc kubenswrapper[4812]: I0131 04:49:17.960541 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2b13-account-delete-26hxs"] Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.046972 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.047106 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d8m2w\" (UniqueName: \"kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.148144 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.148228 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-d8m2w\" 
(UniqueName: \"kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.148943 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.178697 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-d8m2w\" (UniqueName: \"kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w\") pod \"glance2b13-account-delete-26hxs\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.270581 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.349930 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0661a9af-d008-487a-b40e-081859d7aa65" path="/var/lib/kubelet/pods/0661a9af-d008-487a-b40e-081859d7aa65/volumes" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.351361 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65f93101-dc86-4191-b695-a32bc59c1f5c" path="/var/lib/kubelet/pods/65f93101-dc86-4191-b695-a32bc59c1f5c/volumes" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.352086 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2c47516-b290-4a70-a899-c0ffa0f8e3b9" path="/var/lib/kubelet/pods/b2c47516-b290-4a70-a899-c0ffa0f8e3b9/volumes" Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.538257 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance2b13-account-delete-26hxs"] Jan 31 04:49:18 crc kubenswrapper[4812]: I0131 04:49:18.669655 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" event={"ID":"5673e69c-fa20-4de9-96ce-db43cbb65482","Type":"ContainerStarted","Data":"a139034078e1dd8f7893172276cef73eb1fbf087cb86c2e5559e0aedbbe2789a"} Jan 31 04:49:19 crc kubenswrapper[4812]: I0131 04:49:19.684713 4812 generic.go:334] "Generic (PLEG): container finished" podID="5673e69c-fa20-4de9-96ce-db43cbb65482" containerID="dadee4a8340e3b7724a378c5015d9216f0e59911788a87794ccfb28deee3ec3c" exitCode=0 Jan 31 04:49:19 crc kubenswrapper[4812]: I0131 04:49:19.684923 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" event={"ID":"5673e69c-fa20-4de9-96ce-db43cbb65482","Type":"ContainerDied","Data":"dadee4a8340e3b7724a378c5015d9216f0e59911788a87794ccfb28deee3ec3c"} Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.061746 4812 kubelet.go:2421] 
"SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.063555 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.067272 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.090588 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.196400 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-d8m2w\" (UniqueName: \"kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w\") pod \"5673e69c-fa20-4de9-96ce-db43cbb65482\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.196530 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts\") pod \"5673e69c-fa20-4de9-96ce-db43cbb65482\" (UID: \"5673e69c-fa20-4de9-96ce-db43cbb65482\") " Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.196762 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.196810 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: 
\"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.196875 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vws2n\" (UniqueName: \"kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.197351 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "5673e69c-fa20-4de9-96ce-db43cbb65482" (UID: "5673e69c-fa20-4de9-96ce-db43cbb65482"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.215066 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w" (OuterVolumeSpecName: "kube-api-access-d8m2w") pod "5673e69c-fa20-4de9-96ce-db43cbb65482" (UID: "5673e69c-fa20-4de9-96ce-db43cbb65482"). InnerVolumeSpecName "kube-api-access-d8m2w". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.297976 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298054 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vws2n\" (UniqueName: \"kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298156 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298205 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/5673e69c-fa20-4de9-96ce-db43cbb65482-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298220 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-d8m2w\" (UniqueName: \"kubernetes.io/projected/5673e69c-fa20-4de9-96ce-db43cbb65482-kube-api-access-d8m2w\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298639 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.298914 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.316764 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vws2n\" (UniqueName: \"kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n\") pod \"redhat-operators-qq8r6\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.380646 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.701686 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" event={"ID":"5673e69c-fa20-4de9-96ce-db43cbb65482","Type":"ContainerDied","Data":"a139034078e1dd8f7893172276cef73eb1fbf087cb86c2e5559e0aedbbe2789a"} Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.701962 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a139034078e1dd8f7893172276cef73eb1fbf087cb86c2e5559e0aedbbe2789a" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.701745 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance2b13-account-delete-26hxs" Jan 31 04:49:21 crc kubenswrapper[4812]: I0131 04:49:21.789989 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:21 crc kubenswrapper[4812]: W0131 04:49:21.830454 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46aed756_2909_41b0_ba42_723012f5d95a.slice/crio-2613cc42fa1653982ad6917118373c21d4b4feb0f4943074bfbfcf8dfcc4d809 WatchSource:0}: Error finding container 2613cc42fa1653982ad6917118373c21d4b4feb0f4943074bfbfcf8dfcc4d809: Status 404 returned error can't find the container with id 2613cc42fa1653982ad6917118373c21d4b4feb0f4943074bfbfcf8dfcc4d809 Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.714153 4812 generic.go:334] "Generic (PLEG): container finished" podID="46aed756-2909-41b0-ba42-723012f5d95a" containerID="cc7d8ebcadcecbcad8dc4992a5aa294ed31045f35da63735d828f3248e953338" exitCode=0 Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.714465 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerDied","Data":"cc7d8ebcadcecbcad8dc4992a5aa294ed31045f35da63735d828f3248e953338"} Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.714514 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerStarted","Data":"2613cc42fa1653982ad6917118373c21d4b4feb0f4943074bfbfcf8dfcc4d809"} Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.716570 4812 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.975713 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["glance-kuttl-tests/glance-db-create-qkk5m"] Jan 31 04:49:22 crc kubenswrapper[4812]: I0131 04:49:22.981342 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-qkk5m"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.025431 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance2b13-account-delete-26hxs"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.043456 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-2b13-account-create-update-x5b6p"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.053931 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance2b13-account-delete-26hxs"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.064238 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-2b13-account-create-update-x5b6p"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.112279 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-svkmb"] Jan 31 04:49:23 crc kubenswrapper[4812]: E0131 04:49:23.112868 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5673e69c-fa20-4de9-96ce-db43cbb65482" containerName="mariadb-account-delete" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.112885 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5673e69c-fa20-4de9-96ce-db43cbb65482" containerName="mariadb-account-delete" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.113110 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5673e69c-fa20-4de9-96ce-db43cbb65482" containerName="mariadb-account-delete" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.114960 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.129221 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-svkmb"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.177455 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-4bcf-account-create-update-whwr9"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.178539 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.184878 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.186115 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-4bcf-account-create-update-whwr9"] Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.258074 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.258124 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m54tz\" (UniqueName: \"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.359901 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4gcvz\" 
(UniqueName: \"kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.359963 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.360020 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.360096 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-m54tz\" (UniqueName: \"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.361002 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.383465 4812 operation_generator.go:637] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-m54tz\" (UniqueName: \"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz\") pod \"glance-db-create-svkmb\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.447349 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.461223 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-4gcvz\" (UniqueName: \"kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.461276 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.462091 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.477577 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-4gcvz\" (UniqueName: 
\"kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz\") pod \"glance-4bcf-account-create-update-whwr9\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.516254 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.723787 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-svkmb"] Jan 31 04:49:23 crc kubenswrapper[4812]: W0131 04:49:23.726928 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9c43d128_0ca2_41d0_8e21_7c4620a83d53.slice/crio-443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a WatchSource:0}: Error finding container 443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a: Status 404 returned error can't find the container with id 443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.733144 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerStarted","Data":"89e75f16c9165466752ef57fcf4f6da6680d1d0e58f3288d2cd0a0cfbd10bd2a"} Jan 31 04:49:23 crc kubenswrapper[4812]: I0131 04:49:23.790586 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-4bcf-account-create-update-whwr9"] Jan 31 04:49:23 crc kubenswrapper[4812]: W0131 04:49:23.796027 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9bb34317_aaf6_4136_8103_a32261401e60.slice/crio-185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c WatchSource:0}: 
Error finding container 185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c: Status 404 returned error can't find the container with id 185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.398029 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="06a85d8f-31d6-470f-8169-4bb0b8d8fe87" path="/var/lib/kubelet/pods/06a85d8f-31d6-470f-8169-4bb0b8d8fe87/volumes" Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.399033 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5673e69c-fa20-4de9-96ce-db43cbb65482" path="/var/lib/kubelet/pods/5673e69c-fa20-4de9-96ce-db43cbb65482/volumes" Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.399674 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8" path="/var/lib/kubelet/pods/fe7c0560-f3ae-45ef-bb4e-ba80474a2fb8/volumes" Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.747157 4812 generic.go:334] "Generic (PLEG): container finished" podID="9c43d128-0ca2-41d0-8e21-7c4620a83d53" containerID="366a19809bf79181e5960492acbe1cca1fef0f71242f55b20fe6c97a3855e15e" exitCode=0 Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.747242 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-svkmb" event={"ID":"9c43d128-0ca2-41d0-8e21-7c4620a83d53","Type":"ContainerDied","Data":"366a19809bf79181e5960492acbe1cca1fef0f71242f55b20fe6c97a3855e15e"} Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.747302 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-svkmb" event={"ID":"9c43d128-0ca2-41d0-8e21-7c4620a83d53","Type":"ContainerStarted","Data":"443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a"} Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.752191 4812 generic.go:334] "Generic (PLEG): container finished" 
podID="9bb34317-aaf6-4136-8103-a32261401e60" containerID="f75841c41655227dc054cbea93b88f07552339c5ec27980edfef4e564e71f7f6" exitCode=0 Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.752257 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" event={"ID":"9bb34317-aaf6-4136-8103-a32261401e60","Type":"ContainerDied","Data":"f75841c41655227dc054cbea93b88f07552339c5ec27980edfef4e564e71f7f6"} Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.752288 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" event={"ID":"9bb34317-aaf6-4136-8103-a32261401e60","Type":"ContainerStarted","Data":"185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c"} Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.758318 4812 generic.go:334] "Generic (PLEG): container finished" podID="46aed756-2909-41b0-ba42-723012f5d95a" containerID="89e75f16c9165466752ef57fcf4f6da6680d1d0e58f3288d2cd0a0cfbd10bd2a" exitCode=0 Jan 31 04:49:24 crc kubenswrapper[4812]: I0131 04:49:24.758384 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerDied","Data":"89e75f16c9165466752ef57fcf4f6da6680d1d0e58f3288d2cd0a0cfbd10bd2a"} Jan 31 04:49:25 crc kubenswrapper[4812]: I0131 04:49:25.773922 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerStarted","Data":"93e5cbb2bda5ac5cbe1eab6f32b1e66f5ec516a45b04e787a8561580b6805bef"} Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.218043 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.226525 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.234349 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-qq8r6" podStartSLOduration=2.753226839 podStartE2EDuration="5.234328538s" podCreationTimestamp="2026-01-31 04:49:21 +0000 UTC" firstStartedPulling="2026-01-31 04:49:22.716330097 +0000 UTC m=+1371.211351772" lastFinishedPulling="2026-01-31 04:49:25.197431796 +0000 UTC m=+1373.692453471" observedRunningTime="2026-01-31 04:49:25.806297722 +0000 UTC m=+1374.301319397" watchObservedRunningTime="2026-01-31 04:49:26.234328538 +0000 UTC m=+1374.729350203" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.314833 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts\") pod \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.314930 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m54tz\" (UniqueName: \"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz\") pod \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\" (UID: \"9c43d128-0ca2-41d0-8e21-7c4620a83d53\") " Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.315013 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts\") pod \"9bb34317-aaf6-4136-8103-a32261401e60\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " 
Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.315361 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9c43d128-0ca2-41d0-8e21-7c4620a83d53" (UID: "9c43d128-0ca2-41d0-8e21-7c4620a83d53"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.315407 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "9bb34317-aaf6-4136-8103-a32261401e60" (UID: "9bb34317-aaf6-4136-8103-a32261401e60"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.315440 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4gcvz\" (UniqueName: \"kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz\") pod \"9bb34317-aaf6-4136-8103-a32261401e60\" (UID: \"9bb34317-aaf6-4136-8103-a32261401e60\") " Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.316295 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9c43d128-0ca2-41d0-8e21-7c4620a83d53-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.316319 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/9bb34317-aaf6-4136-8103-a32261401e60-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.320737 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz" (OuterVolumeSpecName: "kube-api-access-m54tz") pod "9c43d128-0ca2-41d0-8e21-7c4620a83d53" (UID: "9c43d128-0ca2-41d0-8e21-7c4620a83d53"). InnerVolumeSpecName "kube-api-access-m54tz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.330411 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz" (OuterVolumeSpecName: "kube-api-access-4gcvz") pod "9bb34317-aaf6-4136-8103-a32261401e60" (UID: "9bb34317-aaf6-4136-8103-a32261401e60"). InnerVolumeSpecName "kube-api-access-4gcvz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.417792 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-4gcvz\" (UniqueName: \"kubernetes.io/projected/9bb34317-aaf6-4136-8103-a32261401e60-kube-api-access-4gcvz\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.418138 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-m54tz\" (UniqueName: \"kubernetes.io/projected/9c43d128-0ca2-41d0-8e21-7c4620a83d53-kube-api-access-m54tz\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.782889 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" event={"ID":"9bb34317-aaf6-4136-8103-a32261401e60","Type":"ContainerDied","Data":"185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c"} Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.782914 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-4bcf-account-create-update-whwr9" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.782935 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="185ff4ff7c8ff8e65faac0bfe7db017d5af309f2d9010b39818c9c49444f489c" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.784984 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-svkmb" Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.785028 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-svkmb" event={"ID":"9c43d128-0ca2-41d0-8e21-7c4620a83d53","Type":"ContainerDied","Data":"443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a"} Jan 31 04:49:26 crc kubenswrapper[4812]: I0131 04:49:26.785066 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="443ce3dccc9ab839b283022c9a81c35170018f9aee81174e199123ed2116ce7a" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.483913 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-dxsqr"] Jan 31 04:49:28 crc kubenswrapper[4812]: E0131 04:49:28.484525 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9bb34317-aaf6-4136-8103-a32261401e60" containerName="mariadb-account-create-update" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.484542 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9bb34317-aaf6-4136-8103-a32261401e60" containerName="mariadb-account-create-update" Jan 31 04:49:28 crc kubenswrapper[4812]: E0131 04:49:28.484569 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9c43d128-0ca2-41d0-8e21-7c4620a83d53" containerName="mariadb-database-create" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.484576 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9c43d128-0ca2-41d0-8e21-7c4620a83d53" 
containerName="mariadb-database-create" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.484738 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9c43d128-0ca2-41d0-8e21-7c4620a83d53" containerName="mariadb-database-create" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.484759 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9bb34317-aaf6-4136-8103-a32261401e60" containerName="mariadb-account-create-update" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.485284 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.487304 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.487368 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-mpgds" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.502881 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dxsqr"] Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.547090 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.547158 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc 
kubenswrapper[4812]: I0131 04:49:28.547329 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2p6nm\" (UniqueName: \"kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.649029 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.649094 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.649201 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2p6nm\" (UniqueName: \"kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.655200 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.665009 
4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2p6nm\" (UniqueName: \"kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.665090 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data\") pod \"glance-db-sync-dxsqr\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:28 crc kubenswrapper[4812]: I0131 04:49:28.800584 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:29 crc kubenswrapper[4812]: I0131 04:49:29.282999 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dxsqr"] Jan 31 04:49:29 crc kubenswrapper[4812]: I0131 04:49:29.805090 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dxsqr" event={"ID":"3179759f-4fe9-45ee-9caa-23766a938ad7","Type":"ContainerStarted","Data":"b8a71bf810c419e78f4123e70da0230be6cb0b462ce0247aabcade5de4fa8511"} Jan 31 04:49:30 crc kubenswrapper[4812]: I0131 04:49:30.818954 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dxsqr" event={"ID":"3179759f-4fe9-45ee-9caa-23766a938ad7","Type":"ContainerStarted","Data":"af36f8e080623bebcc8e00506e8941ed5df7ae530bc826a550ed41757d9aa46d"} Jan 31 04:49:30 crc kubenswrapper[4812]: I0131 04:49:30.850113 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-dxsqr" podStartSLOduration=2.85006666 podStartE2EDuration="2.85006666s" podCreationTimestamp="2026-01-31 04:49:28 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:49:30.846066033 +0000 UTC m=+1379.341087748" watchObservedRunningTime="2026-01-31 04:49:30.85006666 +0000 UTC m=+1379.345088335" Jan 31 04:49:31 crc kubenswrapper[4812]: I0131 04:49:31.381026 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:31 crc kubenswrapper[4812]: I0131 04:49:31.381095 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:32 crc kubenswrapper[4812]: I0131 04:49:32.439732 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-qq8r6" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="registry-server" probeResult="failure" output=< Jan 31 04:49:32 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 04:49:32 crc kubenswrapper[4812]: > Jan 31 04:49:32 crc kubenswrapper[4812]: I0131 04:49:32.836159 4812 generic.go:334] "Generic (PLEG): container finished" podID="3179759f-4fe9-45ee-9caa-23766a938ad7" containerID="af36f8e080623bebcc8e00506e8941ed5df7ae530bc826a550ed41757d9aa46d" exitCode=0 Jan 31 04:49:32 crc kubenswrapper[4812]: I0131 04:49:32.836203 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dxsqr" event={"ID":"3179759f-4fe9-45ee-9caa-23766a938ad7","Type":"ContainerDied","Data":"af36f8e080623bebcc8e00506e8941ed5df7ae530bc826a550ed41757d9aa46d"} Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.170500 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.238148 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data\") pod \"3179759f-4fe9-45ee-9caa-23766a938ad7\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.238250 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data\") pod \"3179759f-4fe9-45ee-9caa-23766a938ad7\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.238326 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2p6nm\" (UniqueName: \"kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm\") pod \"3179759f-4fe9-45ee-9caa-23766a938ad7\" (UID: \"3179759f-4fe9-45ee-9caa-23766a938ad7\") " Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.244179 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "3179759f-4fe9-45ee-9caa-23766a938ad7" (UID: "3179759f-4fe9-45ee-9caa-23766a938ad7"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.245570 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm" (OuterVolumeSpecName: "kube-api-access-2p6nm") pod "3179759f-4fe9-45ee-9caa-23766a938ad7" (UID: "3179759f-4fe9-45ee-9caa-23766a938ad7"). 
InnerVolumeSpecName "kube-api-access-2p6nm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.275203 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data" (OuterVolumeSpecName: "config-data") pod "3179759f-4fe9-45ee-9caa-23766a938ad7" (UID: "3179759f-4fe9-45ee-9caa-23766a938ad7"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.339558 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.339601 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/3179759f-4fe9-45ee-9caa-23766a938ad7-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.339657 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2p6nm\" (UniqueName: \"kubernetes.io/projected/3179759f-4fe9-45ee-9caa-23766a938ad7-kube-api-access-2p6nm\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.861722 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-dxsqr" event={"ID":"3179759f-4fe9-45ee-9caa-23766a938ad7","Type":"ContainerDied","Data":"b8a71bf810c419e78f4123e70da0230be6cb0b462ce0247aabcade5de4fa8511"} Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.861774 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="b8a71bf810c419e78f4123e70da0230be6cb0b462ce0247aabcade5de4fa8511" Jan 31 04:49:34 crc kubenswrapper[4812]: I0131 04:49:34.862320 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-dxsqr" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.178037 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:49:36 crc kubenswrapper[4812]: E0131 04:49:36.178663 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3179759f-4fe9-45ee-9caa-23766a938ad7" containerName="glance-db-sync" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.178677 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3179759f-4fe9-45ee-9caa-23766a938ad7" containerName="glance-db-sync" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.178797 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="3179759f-4fe9-45ee-9caa-23766a938ad7" containerName="glance-db-sync" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.179567 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.181814 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-mpgds" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.182168 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.182174 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-single-config-data" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.193703 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269040 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run\") pod 
\"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269144 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269245 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269295 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269313 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269455 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys\") pod 
\"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269525 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269569 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269691 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2jtf8\" (UniqueName: \"kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269733 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269765 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs\") pod 
\"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269843 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269949 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.269971 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.371881 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-2jtf8\" (UniqueName: \"kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.371936 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev\") pod \"glance-default-single-0\" (UID: 
\"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.371962 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372010 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372067 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372446 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") device mount path \"/mnt/openstack/pv08\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372468 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 
04:49:36.372519 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372449 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") device mount path \"/mnt/openstack/pv09\"" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372577 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372583 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372654 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372658 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372670 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372728 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372756 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372779 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372818 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys\") pod \"glance-default-single-0\" (UID: 
\"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372861 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372885 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372891 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372910 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372884 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc 
kubenswrapper[4812]: I0131 04:49:36.372925 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.372964 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.380224 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.391606 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.396227 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.398635 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-2jtf8\" (UniqueName: 
\"kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.400344 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"glance-default-single-0\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.543696 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:36 crc kubenswrapper[4812]: I0131 04:49:36.959466 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:49:37 crc kubenswrapper[4812]: I0131 04:49:37.887636 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerStarted","Data":"1578a929d4a976c5f232df3ebca97166c497e62d6258ab469ed95c5125860cd8"} Jan 31 04:49:37 crc kubenswrapper[4812]: I0131 04:49:37.888338 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerStarted","Data":"340139f59a726574938fdcc0b8c8409074f34c4b3128e5010675cb844fabba19"} Jan 31 04:49:37 crc kubenswrapper[4812]: I0131 04:49:37.888358 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerStarted","Data":"f55d6f8f62002400d842f1b4e0830f4dc457291771ae4867476fcc60ab8de912"} Jan 31 04:49:37 crc kubenswrapper[4812]: I0131 04:49:37.915685 4812 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-0" podStartSLOduration=1.9156635990000002 podStartE2EDuration="1.915663599s" podCreationTimestamp="2026-01-31 04:49:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:49:37.911464168 +0000 UTC m=+1386.406485863" watchObservedRunningTime="2026-01-31 04:49:37.915663599 +0000 UTC m=+1386.410685294" Jan 31 04:49:41 crc kubenswrapper[4812]: I0131 04:49:41.431419 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:41 crc kubenswrapper[4812]: I0131 04:49:41.502586 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.338489 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.338878 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.355396 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.356358 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" 
containerStatusID={"Type":"cri-o","ID":"b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.356465 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718" gracePeriod=600 Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.954449 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718" exitCode=0 Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.954535 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718"} Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.955096 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3"} Jan 31 04:49:44 crc kubenswrapper[4812]: I0131 04:49:44.955161 4812 scope.go:117] "RemoveContainer" containerID="3941ab2783314337d608871e34efeae041c8fd21a85db625d2cda280e4cba1e2" Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.544212 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:46 crc kubenswrapper[4812]: 
I0131 04:49:46.544458 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.568516 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.580171 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.647769 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.648315 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-qq8r6" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="registry-server" containerID="cri-o://93e5cbb2bda5ac5cbe1eab6f32b1e66f5ec516a45b04e787a8561580b6805bef" gracePeriod=2 Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.991199 4812 generic.go:334] "Generic (PLEG): container finished" podID="46aed756-2909-41b0-ba42-723012f5d95a" containerID="93e5cbb2bda5ac5cbe1eab6f32b1e66f5ec516a45b04e787a8561580b6805bef" exitCode=0 Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.991292 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerDied","Data":"93e5cbb2bda5ac5cbe1eab6f32b1e66f5ec516a45b04e787a8561580b6805bef"} Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.992780 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:46 crc kubenswrapper[4812]: I0131 04:49:46.992850 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 
04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.109677 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.246189 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities\") pod \"46aed756-2909-41b0-ba42-723012f5d95a\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.246283 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content\") pod \"46aed756-2909-41b0-ba42-723012f5d95a\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.246322 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vws2n\" (UniqueName: \"kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n\") pod \"46aed756-2909-41b0-ba42-723012f5d95a\" (UID: \"46aed756-2909-41b0-ba42-723012f5d95a\") " Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.247089 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities" (OuterVolumeSpecName: "utilities") pod "46aed756-2909-41b0-ba42-723012f5d95a" (UID: "46aed756-2909-41b0-ba42-723012f5d95a"). InnerVolumeSpecName "utilities". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.253982 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n" (OuterVolumeSpecName: "kube-api-access-vws2n") pod "46aed756-2909-41b0-ba42-723012f5d95a" (UID: "46aed756-2909-41b0-ba42-723012f5d95a"). InnerVolumeSpecName "kube-api-access-vws2n". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.347909 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.347962 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vws2n\" (UniqueName: \"kubernetes.io/projected/46aed756-2909-41b0-ba42-723012f5d95a-kube-api-access-vws2n\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.385283 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "46aed756-2909-41b0-ba42-723012f5d95a" (UID: "46aed756-2909-41b0-ba42-723012f5d95a"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:49:47 crc kubenswrapper[4812]: I0131 04:49:47.451233 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/46aed756-2909-41b0-ba42-723012f5d95a-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.005772 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-qq8r6" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.006943 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-qq8r6" event={"ID":"46aed756-2909-41b0-ba42-723012f5d95a","Type":"ContainerDied","Data":"2613cc42fa1653982ad6917118373c21d4b4feb0f4943074bfbfcf8dfcc4d809"} Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.007673 4812 scope.go:117] "RemoveContainer" containerID="93e5cbb2bda5ac5cbe1eab6f32b1e66f5ec516a45b04e787a8561580b6805bef" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.034215 4812 scope.go:117] "RemoveContainer" containerID="89e75f16c9165466752ef57fcf4f6da6680d1d0e58f3288d2cd0a0cfbd10bd2a" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.071000 4812 scope.go:117] "RemoveContainer" containerID="cc7d8ebcadcecbcad8dc4992a5aa294ed31045f35da63735d828f3248e953338" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.073473 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.081858 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-qq8r6"] Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.355960 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="46aed756-2909-41b0-ba42-723012f5d95a" path="/var/lib/kubelet/pods/46aed756-2909-41b0-ba42-723012f5d95a/volumes" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.809138 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:48 crc kubenswrapper[4812]: I0131 04:49:48.861285 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.075044 4812 kubelet.go:2421] "SyncLoop ADD" 
source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 04:49:52 crc kubenswrapper[4812]: E0131 04:49:52.076118 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="extract-utilities" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.076150 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="extract-utilities" Jan 31 04:49:52 crc kubenswrapper[4812]: E0131 04:49:52.076197 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="extract-content" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.076210 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="extract-content" Jan 31 04:49:52 crc kubenswrapper[4812]: E0131 04:49:52.076249 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="registry-server" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.076263 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="registry-server" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.076565 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="46aed756-2909-41b0-ba42-723012f5d95a" containerName="registry-server" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.078113 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.087812 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.089036 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.096836 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.134995 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231544 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231605 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231627 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231656 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc 
kubenswrapper[4812]: I0131 04:49:52.231675 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231697 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231718 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231738 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231758 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231786 
4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231805 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231829 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231908 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231935 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231957 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.231980 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232008 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232028 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232064 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mhgj4\" (UniqueName: \"kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232427 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232461 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232558 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232612 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232633 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232653 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232672 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xk2bw\" (UniqueName: \"kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232692 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.232721 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334646 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334781 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334831 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334909 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334933 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334971 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.334960 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-mhgj4\" (UniqueName: \"kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335057 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335142 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335177 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335230 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335243 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335349 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335407 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335456 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335407 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335460 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335509 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-xk2bw\" (UniqueName: \"kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335951 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.335990 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336096 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336123 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336181 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336301 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336349 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336428 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336472 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336481 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336535 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336562 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336572 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336586 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336657 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336617 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336704 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336642 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336721 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336786 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336852 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336875 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336917 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336939 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336934 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") device mount path \"/mnt/openstack/pv12\"" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336976 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.337082 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") device mount path \"/mnt/openstack/pv04\"" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.337428 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.337498 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.337560 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.337675 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.336596 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.346990 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.352326 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.356387 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.366067 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.367095 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.369444 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-mhgj4\" (UniqueName: \"kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.373041 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-xk2bw\" (UniqueName: \"kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.383595 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"glance-default-single-1\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.385471 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.389108 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-single-2\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.418750 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.436027 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.746334 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Jan 31 04:49:52 crc kubenswrapper[4812]: I0131 04:49:52.921430 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Jan 31 04:49:52 crc kubenswrapper[4812]: W0131 04:49:52.930298 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda184010f_94c6_4719_beeb_56bbb3ab0932.slice/crio-71e0b6e3ee0ec7bf2ee8c083197a0b2270c62d08bf3b812d75ee15967b9a1cc3 WatchSource:0}: Error finding container 71e0b6e3ee0ec7bf2ee8c083197a0b2270c62d08bf3b812d75ee15967b9a1cc3: Status 404 returned error can't find the container with id 71e0b6e3ee0ec7bf2ee8c083197a0b2270c62d08bf3b812d75ee15967b9a1cc3
Jan 31 04:49:53 crc kubenswrapper[4812]: I0131 04:49:53.050001 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerStarted","Data":"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70"}
Jan 31 04:49:53 crc kubenswrapper[4812]: I0131 04:49:53.050059 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerStarted","Data":"d2e901b683714c7431cdedb066eb3c5b4a6c49a98f93c86081671f394eda323a"}
Jan 31 04:49:53 crc kubenswrapper[4812]: I0131 04:49:53.051338 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerStarted","Data":"71e0b6e3ee0ec7bf2ee8c083197a0b2270c62d08bf3b812d75ee15967b9a1cc3"}
Jan 31 04:49:54 crc kubenswrapper[4812]: I0131 04:49:54.063459 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerStarted","Data":"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c"}
Jan 31 04:49:54 crc kubenswrapper[4812]: I0131 04:49:54.064280 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerStarted","Data":"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922"}
Jan 31 04:49:54 crc kubenswrapper[4812]: I0131 04:49:54.066167 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerStarted","Data":"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73"}
Jan 31 04:49:54 crc kubenswrapper[4812]: I0131 04:49:54.101364 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-1" podStartSLOduration=3.10133484 podStartE2EDuration="3.10133484s" podCreationTimestamp="2026-01-31 04:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:49:54.088526959 +0000 UTC m=+1402.583548694" watchObservedRunningTime="2026-01-31 04:49:54.10133484 +0000 UTC m=+1402.596356545"
Jan 31 04:49:54 crc kubenswrapper[4812]: I0131 04:49:54.123602 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-single-2" podStartSLOduration=3.123576551 podStartE2EDuration="3.123576551s" podCreationTimestamp="2026-01-31 04:49:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:49:54.121488046 +0000 UTC m=+1402.616509751" watchObservedRunningTime="2026-01-31 04:49:54.123576551 +0000 UTC m=+1402.618598246"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.420427 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.421811 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.436513 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.436558 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.460015 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.465221 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.479288 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:02 crc kubenswrapper[4812]: I0131 04:50:02.500893 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:03 crc kubenswrapper[4812]: I0131 04:50:03.147894 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:03 crc kubenswrapper[4812]: I0131 04:50:03.147952 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:03 crc kubenswrapper[4812]: I0131 04:50:03.149033 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:03 crc kubenswrapper[4812]: I0131 04:50:03.149064 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.136302 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.149335 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.165196 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.165243 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.234205 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-1"
Jan 31 04:50:05 crc kubenswrapper[4812]: I0131 04:50:05.324637 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:06 crc kubenswrapper[4812]: I0131 04:50:06.538315 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"]
Jan 31 04:50:06 crc kubenswrapper[4812]: I0131 04:50:06.568100 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"]
Jan 31 04:50:08 crc kubenswrapper[4812]: I0131 04:50:08.189356 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-log" containerID="cri-o://06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70" gracePeriod=30
Jan 31 04:50:08 crc kubenswrapper[4812]: I0131 04:50:08.189640 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-httpd" containerID="cri-o://d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c" gracePeriod=30
Jan 31 04:50:08 crc kubenswrapper[4812]: I0131 04:50:08.189462 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-2" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-httpd" containerID="cri-o://27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73" gracePeriod=30
Jan 31 04:50:08 crc kubenswrapper[4812]: I0131 04:50:08.189526 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-1" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-log" containerID="cri-o://dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922" gracePeriod=30
Jan 31 04:50:09 crc kubenswrapper[4812]: I0131 04:50:09.202733 4812 generic.go:334] "Generic (PLEG): container finished" podID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerID="dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922" exitCode=143
Jan 31 04:50:09 crc kubenswrapper[4812]: I0131 04:50:09.202832 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerDied","Data":"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922"}
Jan 31 04:50:09 crc kubenswrapper[4812]: I0131 04:50:09.206146 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d790836-93eb-4751-8cdc-813851891bcb" containerID="06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70" exitCode=143
Jan 31 04:50:09 crc kubenswrapper[4812]: I0131 04:50:09.206193 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerDied","Data":"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70"}
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.756220 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2"
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770050 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770083 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770100 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770136 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770169 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770195 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770224 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770214 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys" (OuterVolumeSpecName: "sys") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770243 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770260 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xk2bw\" (UniqueName: \"kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770263 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue ""
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770277 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") "
Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770294 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "etc-nvme".
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770344 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770401 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770435 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770353 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev" (OuterVolumeSpecName: "dev") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770371 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770758 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770475 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data\") pod \"1d790836-93eb-4751-8cdc-813851891bcb\" (UID: \"1d790836-93eb-4751-8cdc-813851891bcb\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770807 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs" (OuterVolumeSpecName: "logs") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770876 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run" (OuterVolumeSpecName: "run") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.770898 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). 
InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771328 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771349 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771359 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771368 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771376 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771383 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771391 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/1d790836-93eb-4751-8cdc-813851891bcb-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771399 4812 reconciler_common.go:293] "Volume detached for volume \"run\" 
(UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.771406 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/1d790836-93eb-4751-8cdc-813851891bcb-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.776525 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.776648 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts" (OuterVolumeSpecName: "scripts") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.776722 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw" (OuterVolumeSpecName: "kube-api-access-xk2bw") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "kube-api-access-xk2bw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.776736 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage12-crc" (OuterVolumeSpecName: "glance-cache") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). 
InnerVolumeSpecName "local-storage12-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.845948 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.867248 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data" (OuterVolumeSpecName: "config-data") pod "1d790836-93eb-4751-8cdc-813851891bcb" (UID: "1d790836-93eb-4751-8cdc-813851891bcb"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.872911 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.872995 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.873014 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.873031 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev" (OuterVolumeSpecName: "dev") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.873114 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhgj4\" (UniqueName: \"kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.873726 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.873812 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874354 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs" (OuterVolumeSpecName: "logs") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874417 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874474 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874496 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874541 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874579 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc 
kubenswrapper[4812]: I0131 04:50:11.875004 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875039 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875062 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875081 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run\") pod \"a184010f-94c6-4719-beeb-56bbb3ab0932\" (UID: \"a184010f-94c6-4719-beeb-56bbb3ab0932\") " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874632 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874918 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys" (OuterVolumeSpecName: "sys") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.874944 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875587 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875750 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.875890 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run" (OuterVolumeSpecName: "run") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878790 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878819 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xk2bw\" (UniqueName: \"kubernetes.io/projected/1d790836-93eb-4751-8cdc-813851891bcb-kube-api-access-xk2bw\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878875 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878889 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878900 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878911 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-config-data\") on node 
\"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878924 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878934 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/a184010f-94c6-4719-beeb-56bbb3ab0932-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878948 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878960 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878971 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/1d790836-93eb-4751-8cdc-813851891bcb-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.878983 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.879008 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.879020 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/a184010f-94c6-4719-beeb-56bbb3ab0932-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.879793 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance-cache") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.880122 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts" (OuterVolumeSpecName: "scripts") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.882223 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4" (OuterVolumeSpecName: "kube-api-access-mhgj4") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "kube-api-access-mhgj4". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.887255 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage04-crc" (OuterVolumeSpecName: "glance") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "local-storage04-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.894765 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage12-crc" (UniqueName: "kubernetes.io/local-volume/local-storage12-crc") on node "crc" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.910303 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.938318 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data" (OuterVolumeSpecName: "config-data") pod "a184010f-94c6-4719-beeb-56bbb3ab0932" (UID: "a184010f-94c6-4719-beeb-56bbb3ab0932"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980299 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980333 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980343 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980352 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhgj4\" (UniqueName: \"kubernetes.io/projected/a184010f-94c6-4719-beeb-56bbb3ab0932-kube-api-access-mhgj4\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc 
kubenswrapper[4812]: I0131 04:50:11.980367 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" " Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980376 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/a184010f-94c6-4719-beeb-56bbb3ab0932-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.980385 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage12-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage12-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.992113 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage04-crc" (UniqueName: "kubernetes.io/local-volume/local-storage04-crc") on node "crc" Jan 31 04:50:11 crc kubenswrapper[4812]: I0131 04:50:11.992943 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.081933 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage04-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage04-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.081967 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.237439 4812 generic.go:334] "Generic (PLEG): container finished" podID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerID="d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c" exitCode=0 Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 
04:50:12.237530 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerDied","Data":"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c"} Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.237561 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-1" event={"ID":"a184010f-94c6-4719-beeb-56bbb3ab0932","Type":"ContainerDied","Data":"71e0b6e3ee0ec7bf2ee8c083197a0b2270c62d08bf3b812d75ee15967b9a1cc3"} Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.237559 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-single-1" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.237581 4812 scope.go:117] "RemoveContainer" containerID="d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.240318 4812 generic.go:334] "Generic (PLEG): container finished" podID="1d790836-93eb-4751-8cdc-813851891bcb" containerID="27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73" exitCode=0 Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.240349 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerDied","Data":"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73"} Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.240381 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-2" event={"ID":"1d790836-93eb-4751-8cdc-813851891bcb","Type":"ContainerDied","Data":"d2e901b683714c7431cdedb066eb3c5b4a6c49a98f93c86081671f394eda323a"} Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.240405 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-2" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.272266 4812 scope.go:117] "RemoveContainer" containerID="dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.273121 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.280066 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-2"] Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.294025 4812 scope.go:117] "RemoveContainer" containerID="d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c" Jan 31 04:50:12 crc kubenswrapper[4812]: E0131 04:50:12.294500 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c\": container with ID starting with d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c not found: ID does not exist" containerID="d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.294536 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c"} err="failed to get container status \"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c\": rpc error: code = NotFound desc = could not find container \"d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c\": container with ID starting with d078de41f1fb131026864a791f3d8ce043cc0ff46f634a29dcec1da56bda3a3c not found: ID does not exist" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.294562 4812 scope.go:117] "RemoveContainer" 
containerID="dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922" Jan 31 04:50:12 crc kubenswrapper[4812]: E0131 04:50:12.294958 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922\": container with ID starting with dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922 not found: ID does not exist" containerID="dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.295007 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922"} err="failed to get container status \"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922\": rpc error: code = NotFound desc = could not find container \"dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922\": container with ID starting with dd6de6421cddc828fb64f0fa44c9aff580f44b8711f4894b506f9cfcf7fbe922 not found: ID does not exist" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.295022 4812 scope.go:117] "RemoveContainer" containerID="27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.295053 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.301278 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-1"] Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.331214 4812 scope.go:117] "RemoveContainer" containerID="06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.348612 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="1d790836-93eb-4751-8cdc-813851891bcb" path="/var/lib/kubelet/pods/1d790836-93eb-4751-8cdc-813851891bcb/volumes" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.349243 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" path="/var/lib/kubelet/pods/a184010f-94c6-4719-beeb-56bbb3ab0932/volumes" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.361750 4812 scope.go:117] "RemoveContainer" containerID="27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73" Jan 31 04:50:12 crc kubenswrapper[4812]: E0131 04:50:12.362215 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73\": container with ID starting with 27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73 not found: ID does not exist" containerID="27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.362257 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73"} err="failed to get container status \"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73\": rpc error: code = NotFound desc = could not find container \"27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73\": container with ID starting with 27a283f9349c3602e9f5d40b4ff0bf8248903ce9feeb5cc2ea4a24fb2d04fe73 not found: ID does not exist" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.362283 4812 scope.go:117] "RemoveContainer" containerID="06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70" Jan 31 04:50:12 crc kubenswrapper[4812]: E0131 04:50:12.362830 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70\": container with ID starting with 06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70 not found: ID does not exist" containerID="06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.362887 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70"} err="failed to get container status \"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70\": rpc error: code = NotFound desc = could not find container \"06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70\": container with ID starting with 06fd56e194e6960370f38611ae5add841732a4b7a4dc5c83aca1f22b4add8c70 not found: ID does not exist" Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.865320 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.866008 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-log" containerID="cri-o://340139f59a726574938fdcc0b8c8409074f34c4b3128e5010675cb844fabba19" gracePeriod=30 Jan 31 04:50:12 crc kubenswrapper[4812]: I0131 04:50:12.866108 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-single-0" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-httpd" containerID="cri-o://1578a929d4a976c5f232df3ebca97166c497e62d6258ab469ed95c5125860cd8" gracePeriod=30 Jan 31 04:50:13 crc kubenswrapper[4812]: I0131 04:50:13.251916 4812 generic.go:334] "Generic (PLEG): container finished" podID="8990a51f-0d9e-43a1-a775-746d8dfb656d" 
containerID="340139f59a726574938fdcc0b8c8409074f34c4b3128e5010675cb844fabba19" exitCode=143 Jan 31 04:50:13 crc kubenswrapper[4812]: I0131 04:50:13.252032 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerDied","Data":"340139f59a726574938fdcc0b8c8409074f34c4b3128e5010675cb844fabba19"} Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.280487 4812 generic.go:334] "Generic (PLEG): container finished" podID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerID="1578a929d4a976c5f232df3ebca97166c497e62d6258ab469ed95c5125860cd8" exitCode=0 Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.280740 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerDied","Data":"1578a929d4a976c5f232df3ebca97166c497e62d6258ab469ed95c5125860cd8"} Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.409429 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550127 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550172 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550210 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550244 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550298 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550312 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550307 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550329 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550329 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550374 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev" (OuterVolumeSpecName: "dev") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550409 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys" (OuterVolumeSpecName: "sys") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550487 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-2jtf8\" (UniqueName: \"kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550527 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550551 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550570 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550617 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550650 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550686 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules\") pod \"8990a51f-0d9e-43a1-a775-746d8dfb656d\" (UID: \"8990a51f-0d9e-43a1-a775-746d8dfb656d\") " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550570 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550596 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550719 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run" (OuterVolumeSpecName: "run") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.550752 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551279 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551304 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551317 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551327 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551337 
4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551340 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs" (OuterVolumeSpecName: "logs") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551362 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551374 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.551385 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/8990a51f-0d9e-43a1-a775-746d8dfb656d-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.555257 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8" (OuterVolumeSpecName: "kube-api-access-2jtf8") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "kube-api-access-2jtf8". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.555496 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage08-crc" (OuterVolumeSpecName: "glance-cache") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "local-storage08-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.555715 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts" (OuterVolumeSpecName: "scripts") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.568986 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage09-crc" (OuterVolumeSpecName: "glance") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "local-storage09-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.585547 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data" (OuterVolumeSpecName: "config-data") pod "8990a51f-0d9e-43a1-a775-746d8dfb656d" (UID: "8990a51f-0d9e-43a1-a775-746d8dfb656d"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653297 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653341 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/8990a51f-0d9e-43a1-a775-746d8dfb656d-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653354 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-2jtf8\" (UniqueName: \"kubernetes.io/projected/8990a51f-0d9e-43a1-a775-746d8dfb656d-kube-api-access-2jtf8\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653395 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653413 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" " Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.653424 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/8990a51f-0d9e-43a1-a775-746d8dfb656d-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.668938 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage08-crc" (UniqueName: "kubernetes.io/local-volume/local-storage08-crc") on node "crc" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.674483 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage09-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage09-crc") on node "crc" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.754256 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage09-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage09-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:16 crc kubenswrapper[4812]: I0131 04:50:16.754287 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage08-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage08-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.292200 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-single-0" event={"ID":"8990a51f-0d9e-43a1-a775-746d8dfb656d","Type":"ContainerDied","Data":"f55d6f8f62002400d842f1b4e0830f4dc457291771ae4867476fcc60ab8de912"} Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.292481 4812 scope.go:117] "RemoveContainer" containerID="1578a929d4a976c5f232df3ebca97166c497e62d6258ab469ed95c5125860cd8" Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.292306 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-single-0" Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.326313 4812 scope.go:117] "RemoveContainer" containerID="340139f59a726574938fdcc0b8c8409074f34c4b3128e5010675cb844fabba19" Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.333934 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:50:17 crc kubenswrapper[4812]: I0131 04:50:17.343517 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-single-0"] Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.169974 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dxsqr"] Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.180003 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-dxsqr"] Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213026 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance4bcf-account-delete-wptll"] Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213459 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213489 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213522 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213536 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213554 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213567 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213587 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213598 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213615 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213625 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: E0131 04:50:18.213648 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213660 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213895 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213928 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213950 4812 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.213974 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="a184010f-94c6-4719-beeb-56bbb3ab0932" containerName="glance-httpd" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.214000 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="1d790836-93eb-4751-8cdc-813851891bcb" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.214016 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" containerName="glance-log" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.214738 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.225853 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance4bcf-account-delete-wptll"] Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.348862 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3179759f-4fe9-45ee-9caa-23766a938ad7" path="/var/lib/kubelet/pods/3179759f-4fe9-45ee-9caa-23766a938ad7/volumes" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.349711 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8990a51f-0d9e-43a1-a775-746d8dfb656d" path="/var/lib/kubelet/pods/8990a51f-0d9e-43a1-a775-746d8dfb656d/volumes" Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.382977 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-85bz6\" (UniqueName: \"kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" Jan 31 04:50:18 crc 
kubenswrapper[4812]: I0131 04:50:18.383062 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.483854 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.483959 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-85bz6\" (UniqueName: \"kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.484689 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.508498 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-85bz6\" (UniqueName: \"kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6\") pod \"glance4bcf-account-delete-wptll\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") " pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.535043 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:18 crc kubenswrapper[4812]: I0131 04:50:18.972683 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance4bcf-account-delete-wptll"]
Jan 31 04:50:19 crc kubenswrapper[4812]: I0131 04:50:19.308778 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" event={"ID":"caf3bd57-eaaa-4687-adda-d72b74862302","Type":"ContainerStarted","Data":"8901e46123fe46ba1433e007ba32f598219da20e58162e7df92c8bb695ec658f"}
Jan 31 04:50:19 crc kubenswrapper[4812]: I0131 04:50:19.308822 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" event={"ID":"caf3bd57-eaaa-4687-adda-d72b74862302","Type":"ContainerStarted","Data":"4024c6ded2ad111b11625589d2303acdac2957ba2c6223fe72fe0b96ad1eca56"}
Jan 31 04:50:19 crc kubenswrapper[4812]: I0131 04:50:19.326818 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" podStartSLOduration=1.326801806 podStartE2EDuration="1.326801806s" podCreationTimestamp="2026-01-31 04:50:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:19.323343024 +0000 UTC m=+1427.818364689" watchObservedRunningTime="2026-01-31 04:50:19.326801806 +0000 UTC m=+1427.821823471"
Jan 31 04:50:20 crc kubenswrapper[4812]: I0131 04:50:20.317927 4812 generic.go:334] "Generic (PLEG): container finished" podID="caf3bd57-eaaa-4687-adda-d72b74862302" containerID="8901e46123fe46ba1433e007ba32f598219da20e58162e7df92c8bb695ec658f" exitCode=0
Jan 31 04:50:20 crc kubenswrapper[4812]: I0131 04:50:20.317989 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" event={"ID":"caf3bd57-eaaa-4687-adda-d72b74862302","Type":"ContainerDied","Data":"8901e46123fe46ba1433e007ba32f598219da20e58162e7df92c8bb695ec658f"}
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.415598 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.416906 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.419316 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"default-dockercfg-f54zk"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.420700 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-config-secret"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.420819 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-config"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.421235 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"glance-kuttl-tests"/"openstack-scripts-9db6gc427h"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.441982 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.530568 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.530975 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.531043 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ttxt4\" (UniqueName: \"kubernetes.io/projected/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-kube-api-access-ttxt4\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.531158 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-scripts\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.632527 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-scripts\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.632609 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.632640 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.632659 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ttxt4\" (UniqueName: \"kubernetes.io/projected/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-kube-api-access-ttxt4\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.635713 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.636319 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-scripts\" (UniqueName: \"kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-scripts\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.639393 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"openstack-config-secret\" (UniqueName: \"kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.646906 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ttxt4\" (UniqueName: \"kubernetes.io/projected/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-kube-api-access-ttxt4\") pod \"openstackclient\" (UID: \"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62\") " pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.694815 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.756174 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstackclient"
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.836603 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-85bz6\" (UniqueName: \"kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6\") pod \"caf3bd57-eaaa-4687-adda-d72b74862302\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") "
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.837132 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts\") pod \"caf3bd57-eaaa-4687-adda-d72b74862302\" (UID: \"caf3bd57-eaaa-4687-adda-d72b74862302\") "
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.838688 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "caf3bd57-eaaa-4687-adda-d72b74862302" (UID: "caf3bd57-eaaa-4687-adda-d72b74862302"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.839044 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6" (OuterVolumeSpecName: "kube-api-access-85bz6") pod "caf3bd57-eaaa-4687-adda-d72b74862302" (UID: "caf3bd57-eaaa-4687-adda-d72b74862302"). InnerVolumeSpecName "kube-api-access-85bz6". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.938767 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/caf3bd57-eaaa-4687-adda-d72b74862302-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:21 crc kubenswrapper[4812]: I0131 04:50:21.938808 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-85bz6\" (UniqueName: \"kubernetes.io/projected/caf3bd57-eaaa-4687-adda-d72b74862302-kube-api-access-85bz6\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:22 crc kubenswrapper[4812]: I0131 04:50:22.182108 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/openstackclient"]
Jan 31 04:50:22 crc kubenswrapper[4812]: W0131 04:50:22.184353 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf0f9c46_9e7f_4e5e_832a_5c07ed5e3d62.slice/crio-7b941b506b561c81687c6f1cc1f140af003c0584afd4d2d1dcf9a9a5b0bd3115 WatchSource:0}: Error finding container 7b941b506b561c81687c6f1cc1f140af003c0584afd4d2d1dcf9a9a5b0bd3115: Status 404 returned error can't find the container with id 7b941b506b561c81687c6f1cc1f140af003c0584afd4d2d1dcf9a9a5b0bd3115
Jan 31 04:50:22 crc kubenswrapper[4812]: I0131 04:50:22.336558 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll" event={"ID":"caf3bd57-eaaa-4687-adda-d72b74862302","Type":"ContainerDied","Data":"4024c6ded2ad111b11625589d2303acdac2957ba2c6223fe72fe0b96ad1eca56"}
Jan 31 04:50:22 crc kubenswrapper[4812]: I0131 04:50:22.336626 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4024c6ded2ad111b11625589d2303acdac2957ba2c6223fe72fe0b96ad1eca56"
Jan 31 04:50:22 crc kubenswrapper[4812]: I0131 04:50:22.336713 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance4bcf-account-delete-wptll"
Jan 31 04:50:22 crc kubenswrapper[4812]: I0131 04:50:22.371607 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62","Type":"ContainerStarted","Data":"7b941b506b561c81687c6f1cc1f140af003c0584afd4d2d1dcf9a9a5b0bd3115"}
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.264904 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-svkmb"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.279549 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-svkmb"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.285725 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-4bcf-account-create-update-whwr9"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.290773 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance4bcf-account-delete-wptll"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.295960 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-4bcf-account-create-update-whwr9"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.301948 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance4bcf-account-delete-wptll"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.345214 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-create-thhl4"]
Jan 31 04:50:23 crc kubenswrapper[4812]: E0131 04:50:23.345637 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="caf3bd57-eaaa-4687-adda-d72b74862302" containerName="mariadb-account-delete"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.345666 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="caf3bd57-eaaa-4687-adda-d72b74862302" containerName="mariadb-account-delete"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.345896 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="caf3bd57-eaaa-4687-adda-d72b74862302" containerName="mariadb-account-delete"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.346485 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.355775 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-thhl4"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.357202 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstackclient" event={"ID":"bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62","Type":"ContainerStarted","Data":"6dfbd329a18b347c12a70025c9d4e3c1ba94bbe240494c01073fab78e7c96423"}
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.389464 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/openstackclient" podStartSLOduration=2.389442195 podStartE2EDuration="2.389442195s" podCreationTimestamp="2026-01-31 04:50:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:23.384609896 +0000 UTC m=+1431.879631581" watchObservedRunningTime="2026-01-31 04:50:23.389442195 +0000 UTC m=+1431.884463870"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.441392 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.442182 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.445150 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-db-secret"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.453594 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"]
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.461374 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dtf9l\" (UniqueName: \"kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.461557 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.563653 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.564394 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.564453 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8h8s5\" (UniqueName: \"kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.564554 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-dtf9l\" (UniqueName: \"kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.565369 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.588548 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-dtf9l\" (UniqueName: \"kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l\") pod \"glance-db-create-thhl4\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") " pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.666623 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.666742 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-8h8s5\" (UniqueName: \"kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.667977 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.669094 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.690937 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-8h8s5\" (UniqueName: \"kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5\") pod \"glance-a2e9-account-create-update-m4fls\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") " pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:23 crc kubenswrapper[4812]: I0131 04:50:23.759597 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.218613 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-create-thhl4"]
Jan 31 04:50:24 crc kubenswrapper[4812]: W0131 04:50:24.224523 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podbf00dafa_494f_411d_9155_4bc61d08b46b.slice/crio-f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f WatchSource:0}: Error finding container f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f: Status 404 returned error can't find the container with id f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.255967 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"]
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.355307 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9bb34317-aaf6-4136-8103-a32261401e60" path="/var/lib/kubelet/pods/9bb34317-aaf6-4136-8103-a32261401e60/volumes"
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.358425 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9c43d128-0ca2-41d0-8e21-7c4620a83d53" path="/var/lib/kubelet/pods/9c43d128-0ca2-41d0-8e21-7c4620a83d53/volumes"
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.359803 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caf3bd57-eaaa-4687-adda-d72b74862302" path="/var/lib/kubelet/pods/caf3bd57-eaaa-4687-adda-d72b74862302/volumes"
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.369123 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls" event={"ID":"6ae1265c-41c1-4871-998d-fd3f17598b8e","Type":"ContainerStarted","Data":"87076766cb4ee68666503c64e58f0dfae9019142ad96dff5b2e7089daddbecfe"}
Jan 31 04:50:24 crc kubenswrapper[4812]: I0131 04:50:24.372079 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-thhl4" event={"ID":"bf00dafa-494f-411d-9155-4bc61d08b46b","Type":"ContainerStarted","Data":"f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f"}
Jan 31 04:50:25 crc kubenswrapper[4812]: I0131 04:50:25.381210 4812 generic.go:334] "Generic (PLEG): container finished" podID="6ae1265c-41c1-4871-998d-fd3f17598b8e" containerID="8019826c3c57b77db5b466c6e52567e140ee905c9b8a713f60230f5f269213f7" exitCode=0
Jan 31 04:50:25 crc kubenswrapper[4812]: I0131 04:50:25.381289 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls" event={"ID":"6ae1265c-41c1-4871-998d-fd3f17598b8e","Type":"ContainerDied","Data":"8019826c3c57b77db5b466c6e52567e140ee905c9b8a713f60230f5f269213f7"}
Jan 31 04:50:25 crc kubenswrapper[4812]: I0131 04:50:25.384234 4812 generic.go:334] "Generic (PLEG): container finished" podID="bf00dafa-494f-411d-9155-4bc61d08b46b" containerID="ddf4600fa4e9f3d01c4770936f8efa571970639e0ef06ce4d63665ade579e58d" exitCode=0
Jan 31 04:50:25 crc kubenswrapper[4812]: I0131 04:50:25.384273 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-thhl4" event={"ID":"bf00dafa-494f-411d-9155-4bc61d08b46b","Type":"ContainerDied","Data":"ddf4600fa4e9f3d01c4770936f8efa571970639e0ef06ce4d63665ade579e58d"}
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.790062 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.794317 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.874229 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dtf9l\" (UniqueName: \"kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l\") pod \"bf00dafa-494f-411d-9155-4bc61d08b46b\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") "
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.874303 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts\") pod \"6ae1265c-41c1-4871-998d-fd3f17598b8e\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") "
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.874400 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts\") pod \"bf00dafa-494f-411d-9155-4bc61d08b46b\" (UID: \"bf00dafa-494f-411d-9155-4bc61d08b46b\") "
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.874461 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8h8s5\" (UniqueName: \"kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5\") pod \"6ae1265c-41c1-4871-998d-fd3f17598b8e\" (UID: \"6ae1265c-41c1-4871-998d-fd3f17598b8e\") "
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.874868 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "6ae1265c-41c1-4871-998d-fd3f17598b8e" (UID: "6ae1265c-41c1-4871-998d-fd3f17598b8e"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.875194 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "bf00dafa-494f-411d-9155-4bc61d08b46b" (UID: "bf00dafa-494f-411d-9155-4bc61d08b46b"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue ""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.879431 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5" (OuterVolumeSpecName: "kube-api-access-8h8s5") pod "6ae1265c-41c1-4871-998d-fd3f17598b8e" (UID: "6ae1265c-41c1-4871-998d-fd3f17598b8e"). InnerVolumeSpecName "kube-api-access-8h8s5". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.881121 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l" (OuterVolumeSpecName: "kube-api-access-dtf9l") pod "bf00dafa-494f-411d-9155-4bc61d08b46b" (UID: "bf00dafa-494f-411d-9155-4bc61d08b46b"). InnerVolumeSpecName "kube-api-access-dtf9l". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.976512 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dtf9l\" (UniqueName: \"kubernetes.io/projected/bf00dafa-494f-411d-9155-4bc61d08b46b-kube-api-access-dtf9l\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.976547 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/6ae1265c-41c1-4871-998d-fd3f17598b8e-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.976558 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/bf00dafa-494f-411d-9155-4bc61d08b46b-operator-scripts\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:26 crc kubenswrapper[4812]: I0131 04:50:26.976568 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-8h8s5\" (UniqueName: \"kubernetes.io/projected/6ae1265c-41c1-4871-998d-fd3f17598b8e-kube-api-access-8h8s5\") on node \"crc\" DevicePath \"\""
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.409130 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.410741 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-a2e9-account-create-update-m4fls" event={"ID":"6ae1265c-41c1-4871-998d-fd3f17598b8e","Type":"ContainerDied","Data":"87076766cb4ee68666503c64e58f0dfae9019142ad96dff5b2e7089daddbecfe"}
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.410820 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="87076766cb4ee68666503c64e58f0dfae9019142ad96dff5b2e7089daddbecfe"
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.411316 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-create-thhl4" event={"ID":"bf00dafa-494f-411d-9155-4bc61d08b46b","Type":"ContainerDied","Data":"f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f"}
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.411341 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f25f4e0bc1b683c3e848e66ef86cc1565c763866e0969760e3637e04af8a7e9f"
Jan 31 04:50:27 crc kubenswrapper[4812]: I0131 04:50:27.411422 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-create-thhl4"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.667767 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-db-sync-g4hqp"]
Jan 31 04:50:28 crc kubenswrapper[4812]: E0131 04:50:28.668193 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bf00dafa-494f-411d-9155-4bc61d08b46b" containerName="mariadb-database-create"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.668216 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bf00dafa-494f-411d-9155-4bc61d08b46b" containerName="mariadb-database-create"
Jan 31 04:50:28 crc kubenswrapper[4812]: E0131 04:50:28.668241 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="6ae1265c-41c1-4871-998d-fd3f17598b8e" containerName="mariadb-account-create-update"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.668255 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="6ae1265c-41c1-4871-998d-fd3f17598b8e" containerName="mariadb-account-create-update"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.668474 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="6ae1265c-41c1-4871-998d-fd3f17598b8e" containerName="mariadb-account-create-update"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.668499 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bf00dafa-494f-411d-9155-4bc61d08b46b" containerName="mariadb-database-create"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.669213 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.671192 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-427mq"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.671239 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-config-data"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.681911 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g4hqp"]
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.824927 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.825305 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6ckcn\" (UniqueName: \"kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.825438 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.926476 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-6ckcn\" (UniqueName: \"kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.926555 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.926589 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.930708 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.934978 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:28 crc kubenswrapper[4812]: I0131 04:50:28.949615 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-6ckcn\" (UniqueName: \"kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn\") pod \"glance-db-sync-g4hqp\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:29 crc kubenswrapper[4812]: I0131 04:50:29.035177 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g4hqp"
Jan 31 04:50:29 crc kubenswrapper[4812]: I0131 04:50:29.480074 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g4hqp"]
Jan 31 04:50:30 crc kubenswrapper[4812]: I0131 04:50:30.436319 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g4hqp" event={"ID":"24978441-de4c-4bbf-8eb9-289c7a56df4d","Type":"ContainerStarted","Data":"344fa82c4dd6d056f4281c708cb818888da155a569fa80ba48c3d91457f3dc40"}
Jan 31 04:50:30 crc kubenswrapper[4812]: I0131 04:50:30.436658 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g4hqp" event={"ID":"24978441-de4c-4bbf-8eb9-289c7a56df4d","Type":"ContainerStarted","Data":"0a993295e7864f479203dbb0092425e04df9ee2dc57c624b4834e359d2c23233"}
Jan 31 04:50:30 crc kubenswrapper[4812]: I0131 04:50:30.458070 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-db-sync-g4hqp" podStartSLOduration=2.458056515 podStartE2EDuration="2.458056515s" podCreationTimestamp="2026-01-31 04:50:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:30.454427868 +0000 UTC m=+1438.949449533" watchObservedRunningTime="2026-01-31 04:50:30.458056515 +0000 UTC m=+1438.953078180"
Jan 31 04:50:33 crc kubenswrapper[4812]: I0131 04:50:33.468180 4812 generic.go:334] "Generic (PLEG): container finished" podID="24978441-de4c-4bbf-8eb9-289c7a56df4d" containerID="344fa82c4dd6d056f4281c708cb818888da155a569fa80ba48c3d91457f3dc40" exitCode=0
Jan 31 04:50:33 crc kubenswrapper[4812]: I0131 04:50:33.468296 4812
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g4hqp" event={"ID":"24978441-de4c-4bbf-8eb9-289c7a56df4d","Type":"ContainerDied","Data":"344fa82c4dd6d056f4281c708cb818888da155a569fa80ba48c3d91457f3dc40"} Jan 31 04:50:34 crc kubenswrapper[4812]: I0131 04:50:34.859406 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g4hqp" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.030460 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data\") pod \"24978441-de4c-4bbf-8eb9-289c7a56df4d\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.030579 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6ckcn\" (UniqueName: \"kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn\") pod \"24978441-de4c-4bbf-8eb9-289c7a56df4d\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.030645 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data\") pod \"24978441-de4c-4bbf-8eb9-289c7a56df4d\" (UID: \"24978441-de4c-4bbf-8eb9-289c7a56df4d\") " Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.036290 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn" (OuterVolumeSpecName: "kube-api-access-6ckcn") pod "24978441-de4c-4bbf-8eb9-289c7a56df4d" (UID: "24978441-de4c-4bbf-8eb9-289c7a56df4d"). InnerVolumeSpecName "kube-api-access-6ckcn". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.039275 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data" (OuterVolumeSpecName: "db-sync-config-data") pod "24978441-de4c-4bbf-8eb9-289c7a56df4d" (UID: "24978441-de4c-4bbf-8eb9-289c7a56df4d"). InnerVolumeSpecName "db-sync-config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.099005 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data" (OuterVolumeSpecName: "config-data") pod "24978441-de4c-4bbf-8eb9-289c7a56df4d" (UID: "24978441-de4c-4bbf-8eb9-289c7a56df4d"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.133226 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.133264 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6ckcn\" (UniqueName: \"kubernetes.io/projected/24978441-de4c-4bbf-8eb9-289c7a56df4d-kube-api-access-6ckcn\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.133280 4812 reconciler_common.go:293] "Volume detached for volume \"db-sync-config-data\" (UniqueName: \"kubernetes.io/secret/24978441-de4c-4bbf-8eb9-289c7a56df4d-db-sync-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.490154 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-db-sync-g4hqp" 
event={"ID":"24978441-de4c-4bbf-8eb9-289c7a56df4d","Type":"ContainerDied","Data":"0a993295e7864f479203dbb0092425e04df9ee2dc57c624b4834e359d2c23233"} Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.490518 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-db-sync-g4hqp" Jan 31 04:50:35 crc kubenswrapper[4812]: I0131 04:50:35.490542 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0a993295e7864f479203dbb0092425e04df9ee2dc57c624b4834e359d2c23233" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.805369 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:50:36 crc kubenswrapper[4812]: E0131 04:50:36.806423 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="24978441-de4c-4bbf-8eb9-289c7a56df4d" containerName="glance-db-sync" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.806490 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="24978441-de4c-4bbf-8eb9-289c7a56df4d" containerName="glance-db-sync" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.806685 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="24978441-de4c-4bbf-8eb9-289c7a56df4d" containerName="glance-db-sync" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.807408 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.813186 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-scripts" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.813366 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-glance-dockercfg-427mq" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.813429 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-external-config-data" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.835394 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.958377 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959052 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2www\" (UniqueName: \"kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959243 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959388 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959522 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959640 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959777 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.959922 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960046 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960178 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960297 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960431 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960541 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick\") pod \"glance-default-external-api-1\" 
(UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.960668 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.988859 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:36 crc kubenswrapper[4812]: I0131 04:50:36.990063 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.006385 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.055181 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.056480 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062274 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-t2www\" (UniqueName: \"kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062319 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062356 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062378 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062402 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: 
\"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062429 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062466 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062501 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062526 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062549 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run\") pod \"glance-default-external-api-1\" (UID: 
\"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062571 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062594 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062630 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062659 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062680 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " 
pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062701 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062723 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062747 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062767 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062788 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " 
pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062792 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062807 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062906 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-llq97\" (UniqueName: \"kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062936 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062955 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.062981 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063008 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063035 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063075 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063565 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063621 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063919 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063950 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") device mount path \"/mnt/openstack/pv03\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.063984 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064012 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064039 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys\") pod 
\"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064248 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064289 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064334 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.064415 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") device mount path \"/mnt/openstack/pv06\"" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.066545 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"glance-default-internal-config-data" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.072670 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.073560 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.085403 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.086050 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2www\" (UniqueName: \"kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.086701 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.106911 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.106911 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"glance-default-external-api-1\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.107465 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.126440 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164686 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164723 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164741 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddggj\" (UniqueName: \"kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164763 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164829 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys\") pod 
\"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164966 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.164995 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165021 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165046 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165066 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165106 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165137 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165159 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165183 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165209 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for 
volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165231 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165252 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165274 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165299 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165322 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-llq97\" (UniqueName: 
\"kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165349 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165370 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165389 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165411 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165431 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"run\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165457 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165476 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165515 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165536 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165561 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165619 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165642 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165665 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165694 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165717 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165759 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165782 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165802 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165822 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vnnwb\" (UniqueName: \"kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165869 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: 
\"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165895 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.165914 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166050 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166087 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166574 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166907 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166962 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166972 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.166997 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.167030 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.167235 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.167581 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.167605 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.170627 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.178083 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data\") pod 
\"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.195276 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-llq97\" (UniqueName: \"kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.218012 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.227526 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.269513 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.269882 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " 
pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.269912 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.269959 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.269990 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270010 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270026 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-ddggj\" (UniqueName: \"kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 
04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270045 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270062 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270083 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270100 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270118 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270140 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270164 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270183 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270205 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270223 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270248 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270265 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270280 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270289 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270809 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270878 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: 
\"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.271344 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.271628 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.271976 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.271998 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.272101 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme\") pod 
\"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.272178 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.272209 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.272658 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.270296 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.273746 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") device mount 
path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.273779 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274059 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274106 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274175 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274207 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 
crc kubenswrapper[4812]: I0131 04:50:37.274245 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274270 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274325 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274372 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274401 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vnnwb\" (UniqueName: \"kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 
04:50:37.274439 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274554 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.274642 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.275156 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.275466 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.279117 4812 
operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.280677 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.280763 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.281081 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.288139 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddggj\" (UniqueName: \"kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.290032 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.292562 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.296681 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vnnwb\" (UniqueName: \"kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.307282 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.308428 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.318891 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.321038 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.329610 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.416889 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:50:37 crc kubenswrapper[4812]: W0131 04:50:37.427149 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod61306405_d45a_4632_b239_6feec523a983.slice/crio-7c14fd01d218de2ecd47412731a4fd1f85ab17e5617461fc0efe0e5f11bde4d4 WatchSource:0}: Error finding container 7c14fd01d218de2ecd47412731a4fd1f85ab17e5617461fc0efe0e5f11bde4d4: Status 404 returned error can't find the container with id 7c14fd01d218de2ecd47412731a4fd1f85ab17e5617461fc0efe0e5f11bde4d4 Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.507433 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerStarted","Data":"7c14fd01d218de2ecd47412731a4fd1f85ab17e5617461fc0efe0e5f11bde4d4"} Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.535344 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.557784 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.806270 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.813883 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:37 crc kubenswrapper[4812]: I0131 04:50:37.851017 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.147567 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.519432 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerStarted","Data":"5407ec7c7fc3bcb6599983e4e650f15b7c769ae3126e25b64f2b25d2a92de51b"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.519859 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerStarted","Data":"d54859e0fbe35dc69c2ce3fea3e550f34245238ca8e3ce52913295a2e134cedc"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.521672 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerStarted","Data":"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 
04:50:38.521698 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerStarted","Data":"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.521707 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerStarted","Data":"f81a372755b9f09388c7056f63a704d28b203768d79d971cfe2dbe9c3e31a0e2"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.524425 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerStarted","Data":"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.524489 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerStarted","Data":"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.524509 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerStarted","Data":"670b8931f88b5aa931ee34525fe44c2203015cb17cc56504fc8168f9a3f82402"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.526682 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerStarted","Data":"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.526715 4812 kubelet.go:2453] "SyncLoop 
(PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerStarted","Data":"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.526727 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerStarted","Data":"e7c32f72e1151f4b60963ed554d1b8c14c2d6de4db9101d80637bbb7f2d3e547"} Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.526773 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-log" containerID="cri-o://5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" gracePeriod=30 Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.526825 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-httpd" containerID="cri-o://958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" gracePeriod=30 Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.553731 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-1" podStartSLOduration=2.553709995 podStartE2EDuration="2.553709995s" podCreationTimestamp="2026-01-31 04:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:38.545008223 +0000 UTC m=+1447.040029898" watchObservedRunningTime="2026-01-31 04:50:38.553709995 +0000 UTC m=+1447.048731660" Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.578536 4812 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.578521855 podStartE2EDuration="2.578521855s" podCreationTimestamp="2026-01-31 04:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:38.575592537 +0000 UTC m=+1447.070614202" watchObservedRunningTime="2026-01-31 04:50:38.578521855 +0000 UTC m=+1447.073543520" Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.611720 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=3.6117051780000002 podStartE2EDuration="3.611705178s" podCreationTimestamp="2026-01-31 04:50:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:38.602486863 +0000 UTC m=+1447.097508528" watchObservedRunningTime="2026-01-31 04:50:38.611705178 +0000 UTC m=+1447.106726843" Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.635602 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.635586023 podStartE2EDuration="2.635586023s" podCreationTimestamp="2026-01-31 04:50:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:38.630388154 +0000 UTC m=+1447.125409829" watchObservedRunningTime="2026-01-31 04:50:38.635586023 +0000 UTC m=+1447.130607688" Jan 31 04:50:38 crc kubenswrapper[4812]: I0131 04:50:38.918533 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.019492 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.019933 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.019970 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.019645 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020013 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020045 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020089 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020143 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vnnwb\" (UniqueName: \"kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020160 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020185 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020191 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020221 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020280 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020317 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020347 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" 
(UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020287 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys" (OuterVolumeSpecName: "sys") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020393 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020435 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run\") pod \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\" (UID: \"def75e66-7828-4ccf-9a8f-9b0a8ef6f767\") " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020664 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs" (OuterVolumeSpecName: "logs") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020710 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev" (OuterVolumeSpecName: "dev") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020742 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.020954 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run" (OuterVolumeSpecName: "run") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021075 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021173 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021192 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021206 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021217 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021228 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021239 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021250 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021260 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: 
\"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.021270 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.025981 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "local-storage13-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.025991 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb" (OuterVolumeSpecName: "kube-api-access-vnnwb") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "kube-api-access-vnnwb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.026105 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts" (OuterVolumeSpecName: "scripts") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.026110 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). 
InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.084399 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data" (OuterVolumeSpecName: "config-data") pod "def75e66-7828-4ccf-9a8f-9b0a8ef6f767" (UID: "def75e66-7828-4ccf-9a8f-9b0a8ef6f767"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.123071 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.123114 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vnnwb\" (UniqueName: \"kubernetes.io/projected/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-kube-api-access-vnnwb\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.123236 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.123249 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/def75e66-7828-4ccf-9a8f-9b0a8ef6f767-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.123276 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.147204 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: 
"kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.155076 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.225198 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.225224 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538142 4812 generic.go:334] "Generic (PLEG): container finished" podID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerID="958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" exitCode=143 Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538482 4812 generic.go:334] "Generic (PLEG): container finished" podID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerID="5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" exitCode=143 Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538278 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538178 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerDied","Data":"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9"} Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538594 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerDied","Data":"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb"} Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538620 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"def75e66-7828-4ccf-9a8f-9b0a8ef6f767","Type":"ContainerDied","Data":"e7c32f72e1151f4b60963ed554d1b8c14c2d6de4db9101d80637bbb7f2d3e547"} Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.538641 4812 scope.go:117] "RemoveContainer" containerID="958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.576730 4812 scope.go:117] "RemoveContainer" containerID="5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.585095 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.591617 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.600400 4812 scope.go:117] "RemoveContainer" containerID="958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" Jan 31 04:50:39 crc kubenswrapper[4812]: E0131 04:50:39.600744 4812 
log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9\": container with ID starting with 958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9 not found: ID does not exist" containerID="958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.600795 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9"} err="failed to get container status \"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9\": rpc error: code = NotFound desc = could not find container \"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9\": container with ID starting with 958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9 not found: ID does not exist" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.600829 4812 scope.go:117] "RemoveContainer" containerID="5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" Jan 31 04:50:39 crc kubenswrapper[4812]: E0131 04:50:39.601049 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb\": container with ID starting with 5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb not found: ID does not exist" containerID="5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.601072 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb"} err="failed to get container status \"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb\": rpc error: code = NotFound 
desc = could not find container \"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb\": container with ID starting with 5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb not found: ID does not exist" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.601086 4812 scope.go:117] "RemoveContainer" containerID="958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.601299 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9"} err="failed to get container status \"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9\": rpc error: code = NotFound desc = could not find container \"958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9\": container with ID starting with 958b9dc4e39c5cc082ddc72cac0b2dacb3b2a5c328bc7c63e1673fd1a29150f9 not found: ID does not exist" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.601326 4812 scope.go:117] "RemoveContainer" containerID="5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.601546 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb"} err="failed to get container status \"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb\": rpc error: code = NotFound desc = could not find container \"5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb\": container with ID starting with 5e0a4afdcb0e6a260d1b6e81e4dd9e2bdd060f7acb1d15aac08b21358ffc3ecb not found: ID does not exist" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.620055 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:39 crc kubenswrapper[4812]: E0131 
04:50:39.620389 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-log" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.620407 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-log" Jan 31 04:50:39 crc kubenswrapper[4812]: E0131 04:50:39.620417 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-httpd" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.620424 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-httpd" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.620577 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-log" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.620593 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" containerName="glance-httpd" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.621349 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.633734 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.731832 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.833783 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.833888 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.833957 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834009 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834035 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834060 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834115 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834144 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834165 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834288 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") device mount path \"/mnt/openstack/pv13\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834316 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834462 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834504 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834534 
4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.834568 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.853676 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.935875 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.935949 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936000 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936024 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936063 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936037 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936144 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936136 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for 
volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936158 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") device mount path \"/mnt/openstack/pv18\"" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936466 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936530 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936630 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936662 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" 
(UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936747 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936779 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936803 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936859 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936876 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936876 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936897 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.936924 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.937003 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.937130 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run\") pod \"glance-default-internal-api-1\" (UID: 
\"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.940613 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.940807 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.956219 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:39 crc kubenswrapper[4812]: I0131 04:50:39.975198 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f\") pod \"glance-default-internal-api-1\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:40 crc kubenswrapper[4812]: I0131 04:50:40.236690 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:40 crc kubenswrapper[4812]: I0131 04:50:40.351513 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="def75e66-7828-4ccf-9a8f-9b0a8ef6f767" path="/var/lib/kubelet/pods/def75e66-7828-4ccf-9a8f-9b0a8ef6f767/volumes" Jan 31 04:50:40 crc kubenswrapper[4812]: I0131 04:50:40.754420 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:50:40 crc kubenswrapper[4812]: W0131 04:50:40.758357 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod83b00f57_a375_47cd_ae26_86353c3ccf61.slice/crio-0657f0a1f09551d114964b1bd6a59181475a89012952621141e314d7281f0548 WatchSource:0}: Error finding container 0657f0a1f09551d114964b1bd6a59181475a89012952621141e314d7281f0548: Status 404 returned error can't find the container with id 0657f0a1f09551d114964b1bd6a59181475a89012952621141e314d7281f0548 Jan 31 04:50:41 crc kubenswrapper[4812]: I0131 04:50:41.562894 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerStarted","Data":"a7f7035d5ffcca63f5f6e955417650aa325575108fa642738e59e4246359dc7b"} Jan 31 04:50:41 crc kubenswrapper[4812]: I0131 04:50:41.563270 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerStarted","Data":"76f04d53de11321b69821bfacf8584ebc9fcbf701e16c36b26ad3627885d0549"} Jan 31 04:50:41 crc kubenswrapper[4812]: I0131 04:50:41.563291 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerStarted","Data":"0657f0a1f09551d114964b1bd6a59181475a89012952621141e314d7281f0548"} Jan 31 04:50:41 crc kubenswrapper[4812]: I0131 04:50:41.591477 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-1" podStartSLOduration=2.591461441 podStartE2EDuration="2.591461441s" podCreationTimestamp="2026-01-31 04:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:41.589041838 +0000 UTC m=+1450.084063503" watchObservedRunningTime="2026-01-31 04:50:41.591461441 +0000 UTC m=+1450.086483106" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.127569 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.128231 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.161628 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.176961 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.321522 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.321596 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.356983 4812 kubelet.go:2542] "SyncLoop (probe)" 
probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.366952 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.536098 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.537343 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.580218 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.606909 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638062 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638113 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638132 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638150 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638166 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 
04:50:47 crc kubenswrapper[4812]: I0131 04:50:47.638182 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.377202 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.380153 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.513175 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.516338 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.585963 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.679820 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.679972 4812 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Jan 31 04:50:49 crc kubenswrapper[4812]: I0131 04:50:49.682786 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.237260 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.239248 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.281125 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.283454 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.662117 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-log" containerID="cri-o://50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5" gracePeriod=30 Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.662353 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-httpd" containerID="cri-o://9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b" gracePeriod=30 Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.663446 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.664877 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.670178 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.144:9292/healthcheck\": EOF" Jan 31 04:50:50 crc kubenswrapper[4812]: I0131 04:50:50.670304 4812 prober.go:107] "Probe failed" 
probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.144:9292/healthcheck\": EOF" Jan 31 04:50:51 crc kubenswrapper[4812]: I0131 04:50:51.676342 4812 generic.go:334] "Generic (PLEG): container finished" podID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerID="50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5" exitCode=143 Jan 31 04:50:51 crc kubenswrapper[4812]: I0131 04:50:51.678187 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerDied","Data":"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5"} Jan 31 04:50:52 crc kubenswrapper[4812]: I0131 04:50:52.512190 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:52 crc kubenswrapper[4812]: I0131 04:50:52.589309 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:50:52 crc kubenswrapper[4812]: I0131 04:50:52.643876 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:52 crc kubenswrapper[4812]: I0131 04:50:52.644208 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-log" containerID="cri-o://cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941" gracePeriod=30 Jan 31 04:50:52 crc kubenswrapper[4812]: I0131 04:50:52.644393 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" 
containerName="glance-httpd" containerID="cri-o://b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178" gracePeriod=30 Jan 31 04:50:53 crc kubenswrapper[4812]: I0131 04:50:53.703335 4812 generic.go:334] "Generic (PLEG): container finished" podID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerID="cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941" exitCode=143 Jan 31 04:50:53 crc kubenswrapper[4812]: I0131 04:50:53.703556 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerDied","Data":"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941"} Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.488771 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.597933 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598062 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598154 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run" (OuterVolumeSpecName: "run") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598208 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598237 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598316 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598336 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598366 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598384 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys\") pod 
\"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598410 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598438 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598473 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598495 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598532 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-llq97\" (UniqueName: \"kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598559 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: 
\"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts\") pod \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\" (UID: \"3b209907-eab8-4d9c-8766-3c2b8c52fc20\") " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598800 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys" (OuterVolumeSpecName: "sys") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598962 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.598985 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.599025 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.599052 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev" (OuterVolumeSpecName: "dev") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.599075 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.599097 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.603724 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.603983 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs" (OuterVolumeSpecName: "logs") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.604014 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.615997 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97" (OuterVolumeSpecName: "kube-api-access-llq97") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "kube-api-access-llq97". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.622081 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts" (OuterVolumeSpecName: "scripts") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.622204 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "local-storage05-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.637030 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.688249 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data" (OuterVolumeSpecName: "config-data") pod "3b209907-eab8-4d9c-8766-3c2b8c52fc20" (UID: "3b209907-eab8-4d9c-8766-3c2b8c52fc20"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700330 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700363 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700395 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700405 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 
04:50:55.700414 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700424 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/3b209907-eab8-4d9c-8766-3c2b8c52fc20-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700438 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700449 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-llq97\" (UniqueName: \"kubernetes.io/projected/3b209907-eab8-4d9c-8766-3c2b8c52fc20-kube-api-access-llq97\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700457 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700468 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/3b209907-eab8-4d9c-8766-3c2b8c52fc20-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700477 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.700484 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/3b209907-eab8-4d9c-8766-3c2b8c52fc20-dev\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.712750 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.715258 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.726403 4812 generic.go:334] "Generic (PLEG): container finished" podID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerID="9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b" exitCode=0 Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.726454 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerDied","Data":"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b"} Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.726495 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"3b209907-eab8-4d9c-8766-3c2b8c52fc20","Type":"ContainerDied","Data":"f81a372755b9f09388c7056f63a704d28b203768d79d971cfe2dbe9c3e31a0e2"} Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.726502 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.726519 4812 scope.go:117] "RemoveContainer" containerID="9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.749039 4812 scope.go:117] "RemoveContainer" containerID="50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.763176 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.768709 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.774274 4812 scope.go:117] "RemoveContainer" containerID="9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b" Jan 31 04:50:55 crc kubenswrapper[4812]: E0131 04:50:55.774762 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b\": container with ID starting with 9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b not found: ID does not exist" containerID="9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.774804 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b"} err="failed to get container status \"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b\": rpc error: code = NotFound desc = could not find container \"9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b\": container with ID starting with 9612d42bcd6e3ec18fcd6cbff709c44c7a2c80bf9cb1850450cdbd61c7ede49b not 
found: ID does not exist" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.774830 4812 scope.go:117] "RemoveContainer" containerID="50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5" Jan 31 04:50:55 crc kubenswrapper[4812]: E0131 04:50:55.775347 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5\": container with ID starting with 50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5 not found: ID does not exist" containerID="50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.775391 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5"} err="failed to get container status \"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5\": rpc error: code = NotFound desc = could not find container \"50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5\": container with ID starting with 50857ad64e06bf7b500ad9290906a7694c90644f8805d7b80119bd05c6d77ee5 not found: ID does not exist" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.786232 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:55 crc kubenswrapper[4812]: E0131 04:50:55.786644 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-httpd" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.786689 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-httpd" Jan 31 04:50:55 crc kubenswrapper[4812]: E0131 04:50:55.786737 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" 
containerName="glance-log" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.786744 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-log" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.786938 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-httpd" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.786974 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" containerName="glance-log" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.787826 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.794034 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.801597 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.801622 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.905705 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.905771 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.905800 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.905870 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.905918 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906036 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906061 4812 
reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906084 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906107 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906245 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906304 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc 
kubenswrapper[4812]: I0131 04:50:55.906403 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906526 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:55 crc kubenswrapper[4812]: I0131 04:50:55.906543 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-szkx4\" (UniqueName: \"kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008321 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008380 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc 
kubenswrapper[4812]: I0131 04:50:56.008407 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008427 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-szkx4\" (UniqueName: \"kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008460 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008482 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008499 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008514 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008538 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008562 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008579 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008595 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008612 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008638 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008800 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.008904 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009034 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009048 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run\") 
pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009080 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009129 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009087 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009147 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") device mount path \"/mnt/openstack/pv01\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009489 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs\") pod \"glance-default-external-api-0\" (UID: 
\"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.009587 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.011076 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") device mount path \"/mnt/openstack/pv05\"" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.016031 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.016546 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.026415 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-szkx4\" (UniqueName: \"kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4\") pod \"glance-default-external-api-0\" (UID: 
\"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.034266 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.038106 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"glance-default-external-api-0\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.102242 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.209995 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.312961 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313022 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313070 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ddggj\" (UniqueName: \"kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313093 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313120 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313138 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313182 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313198 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev" (OuterVolumeSpecName: "dev") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313198 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313271 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313290 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys" (OuterVolumeSpecName: "sys") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). 
InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313329 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313360 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313389 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313457 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313450 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs" (OuterVolumeSpecName: "logs") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313449 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run" (OuterVolumeSpecName: "run") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313486 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313503 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick\") pod \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\" (UID: \"e8aa6a4a-4eed-40c4-9506-5d519d484ea3\") " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313504 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313514 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313749 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.313729 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314117 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314132 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314143 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314152 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc 
kubenswrapper[4812]: I0131 04:50:56.314161 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314170 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314178 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314186 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.314194 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.316186 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.316863 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). 
InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.316995 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj" (OuterVolumeSpecName: "kube-api-access-ddggj") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "kube-api-access-ddggj". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.317552 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts" (OuterVolumeSpecName: "scripts") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.349143 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3b209907-eab8-4d9c-8766-3c2b8c52fc20" path="/var/lib/kubelet/pods/3b209907-eab8-4d9c-8766-3c2b8c52fc20/volumes" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.363498 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data" (OuterVolumeSpecName: "config-data") pod "e8aa6a4a-4eed-40c4-9506-5d519d484ea3" (UID: "e8aa6a4a-4eed-40c4-9506-5d519d484ea3"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.402358 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.415557 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.415587 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-ddggj\" (UniqueName: \"kubernetes.io/projected/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-kube-api-access-ddggj\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.415609 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.415618 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/e8aa6a4a-4eed-40c4-9506-5d519d484ea3-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.415630 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.429819 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.439559 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 31 
04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.516731 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.516930 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.736562 4812 generic.go:334] "Generic (PLEG): container finished" podID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerID="b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178" exitCode=0 Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.736611 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.736655 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerDied","Data":"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178"} Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.736718 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"e8aa6a4a-4eed-40c4-9506-5d519d484ea3","Type":"ContainerDied","Data":"670b8931f88b5aa931ee34525fe44c2203015cb17cc56504fc8168f9a3f82402"} Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.736744 4812 scope.go:117] "RemoveContainer" containerID="b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.738282 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" 
event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerStarted","Data":"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee"} Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.738312 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerStarted","Data":"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612"} Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.738325 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerStarted","Data":"cbc658d76c0c8e85c44de8c5578b9617bf6d1ad94ba072b366d2f393a5cedc97"} Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.763503 4812 scope.go:117] "RemoveContainer" containerID="cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.772364 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-external-api-0" podStartSLOduration=1.7723344939999999 podStartE2EDuration="1.772334494s" podCreationTimestamp="2026-01-31 04:50:55 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:56.765748469 +0000 UTC m=+1465.260770154" watchObservedRunningTime="2026-01-31 04:50:56.772334494 +0000 UTC m=+1465.267356179" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.798186 4812 scope.go:117] "RemoveContainer" containerID="b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178" Jan 31 04:50:56 crc kubenswrapper[4812]: E0131 04:50:56.800597 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178\": container with ID starting with b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178 not found: ID does not exist" containerID="b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.800646 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178"} err="failed to get container status \"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178\": rpc error: code = NotFound desc = could not find container \"b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178\": container with ID starting with b27ebf9f34152b87bf64e7c03ebfa0d14b0e849b49560fbc50fa7a288c1f3178 not found: ID does not exist" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.800671 4812 scope.go:117] "RemoveContainer" containerID="cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941" Jan 31 04:50:56 crc kubenswrapper[4812]: E0131 04:50:56.801131 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941\": container with ID starting with cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941 not found: ID does not exist" containerID="cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.801167 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941"} err="failed to get container status \"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941\": rpc error: code = NotFound desc = could not find container \"cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941\": container with ID 
starting with cfbcca1c171a6044ac50bcc3a438af2ccaf8b53b7ca51c52a8c18f461f2b0941 not found: ID does not exist" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.824267 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.836091 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.846604 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:56 crc kubenswrapper[4812]: E0131 04:50:56.847160 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-log" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.847270 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-log" Jan 31 04:50:56 crc kubenswrapper[4812]: E0131 04:50:56.847333 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-httpd" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.847385 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-httpd" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.847584 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-log" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.847675 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" containerName="glance-httpd" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.848516 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.852628 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.927648 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928081 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928174 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928272 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928349 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928419 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928526 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928617 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928708 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928794 
4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928888 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.928979 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:56 crc kubenswrapper[4812]: I0131 04:50:56.929062 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030009 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030297 4812 
reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030407 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030511 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030621 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s56f8\" (UniqueName: \"kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030368 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030559 4812 operation_generator.go:637] 
"MountVolume.SetUp succeeded for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030148 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030895 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030944 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.030971 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031010 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"dev\" (UniqueName: 
\"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031085 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031106 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031142 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031184 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031215 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031316 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031335 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031372 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031503 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") device mount path \"/mnt/openstack/pv14\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.031665 4812 operation_generator.go:580] "MountVolume.MountDevice succeeded for volume \"local-storage19-crc\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") device mount path \"/mnt/openstack/pv19\"" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.034577 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.034600 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.034664 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.046571 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.046877 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data\") pod 
\"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.049826 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.052204 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.132291 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-s56f8\" (UniqueName: \"kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.153138 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-s56f8\" (UniqueName: \"kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8\") pod \"glance-default-internal-api-0\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.172052 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.580240 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:50:57 crc kubenswrapper[4812]: W0131 04:50:57.588355 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podc3f36241_e6de_4a61_8f71_5334a7e079f4.slice/crio-b7463e5229035dc905e8e6500a7e3c5dd6ccc0271a1cd2e15e4e0ec35262810a WatchSource:0}: Error finding container b7463e5229035dc905e8e6500a7e3c5dd6ccc0271a1cd2e15e4e0ec35262810a: Status 404 returned error can't find the container with id b7463e5229035dc905e8e6500a7e3c5dd6ccc0271a1cd2e15e4e0ec35262810a Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.752477 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerStarted","Data":"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6"} Jan 31 04:50:57 crc kubenswrapper[4812]: I0131 04:50:57.752704 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerStarted","Data":"b7463e5229035dc905e8e6500a7e3c5dd6ccc0271a1cd2e15e4e0ec35262810a"} Jan 31 04:50:58 crc kubenswrapper[4812]: I0131 04:50:58.353889 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e8aa6a4a-4eed-40c4-9506-5d519d484ea3" path="/var/lib/kubelet/pods/e8aa6a4a-4eed-40c4-9506-5d519d484ea3/volumes" Jan 31 04:50:58 crc kubenswrapper[4812]: I0131 04:50:58.765111 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerStarted","Data":"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94"} Jan 31 04:50:58 crc kubenswrapper[4812]: I0131 04:50:58.795110 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glance-default-internal-api-0" podStartSLOduration=2.7950808 podStartE2EDuration="2.7950808s" podCreationTimestamp="2026-01-31 04:50:56 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:50:58.787454868 +0000 UTC m=+1467.282476563" watchObservedRunningTime="2026-01-31 04:50:58.7950808 +0000 UTC m=+1467.290102485" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.102623 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.103159 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.137387 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.174877 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.863322 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:06 crc kubenswrapper[4812]: I0131 04:51:06.863381 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.173294 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.173607 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.194853 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.220592 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.871020 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:07 crc kubenswrapper[4812]: I0131 04:51:07.871059 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:08 crc kubenswrapper[4812]: I0131 04:51:08.776709 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:08 crc kubenswrapper[4812]: I0131 04:51:08.778694 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:09 crc kubenswrapper[4812]: I0131 04:51:09.650527 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:09 crc kubenswrapper[4812]: I0131 04:51:09.718693 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:41 crc kubenswrapper[4812]: I0131 04:51:41.324200 4812 scope.go:117] "RemoveContainer" containerID="26873c4d9a3014a080280bd37de48a3d706060c92eb74b2c763c03ae7385eb69" Jan 31 04:51:41 crc 
kubenswrapper[4812]: I0131 04:51:41.358407 4812 scope.go:117] "RemoveContainer" containerID="bfbd93efb73a65ac1947c8d486ed67493c52be9dbd3c1737d4ae7f15934ab952" Jan 31 04:51:44 crc kubenswrapper[4812]: I0131 04:51:44.337884 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:51:44 crc kubenswrapper[4812]: I0131 04:51:44.338272 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:51:52 crc kubenswrapper[4812]: I0131 04:51:52.940264 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:51:52 crc kubenswrapper[4812]: I0131 04:51:52.951232 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-log" containerID="cri-o://d54859e0fbe35dc69c2ce3fea3e550f34245238ca8e3ce52913295a2e134cedc" gracePeriod=30 Jan 31 04:51:52 crc kubenswrapper[4812]: I0131 04:51:52.951712 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-1" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-httpd" containerID="cri-o://5407ec7c7fc3bcb6599983e4e650f15b7c769ae3126e25b64f2b25d2a92de51b" gracePeriod=30 Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.119160 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 
04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.119468 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-log" containerID="cri-o://76f04d53de11321b69821bfacf8584ebc9fcbf701e16c36b26ad3627885d0549" gracePeriod=30 Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.119557 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-1" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-httpd" containerID="cri-o://a7f7035d5ffcca63f5f6e955417650aa325575108fa642738e59e4246359dc7b" gracePeriod=30 Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.354061 4812 generic.go:334] "Generic (PLEG): container finished" podID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerID="76f04d53de11321b69821bfacf8584ebc9fcbf701e16c36b26ad3627885d0549" exitCode=143 Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.354147 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerDied","Data":"76f04d53de11321b69821bfacf8584ebc9fcbf701e16c36b26ad3627885d0549"} Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.355420 4812 generic.go:334] "Generic (PLEG): container finished" podID="61306405-d45a-4632-b239-6feec523a983" containerID="d54859e0fbe35dc69c2ce3fea3e550f34245238ca8e3ce52913295a2e134cedc" exitCode=143 Jan 31 04:51:53 crc kubenswrapper[4812]: I0131 04:51:53.355458 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerDied","Data":"d54859e0fbe35dc69c2ce3fea3e550f34245238ca8e3ce52913295a2e134cedc"} Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.357113 4812 kubelet.go:2437] "SyncLoop DELETE" 
source="api" pods=["glance-kuttl-tests/glance-db-sync-g4hqp"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.357387 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-sync-g4hqp"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.390160 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/glancea2e9-account-delete-jt4d7"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.391337 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.403999 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea2e9-account-delete-jt4d7"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.440615 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.440871 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-log" containerID="cri-o://51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612" gracePeriod=30 Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.440911 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-httpd" containerID="cri-o://8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee" gracePeriod=30 Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.523910 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.524173 4812 kuberuntime_container.go:808] "Killing container with a grace period" 
pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-log" containerID="cri-o://6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6" gracePeriod=30 Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.524519 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-httpd" containerID="cri-o://cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94" gracePeriod=30 Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.535354 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.535450 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vhzn4\" (UniqueName: \"kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.636369 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-vhzn4\" (UniqueName: \"kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.636486 4812 reconciler_common.go:218] 
"operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.637209 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.657360 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-vhzn4\" (UniqueName: \"kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4\") pod \"glancea2e9-account-delete-jt4d7\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:54 crc kubenswrapper[4812]: I0131 04:51:54.748052 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.157353 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/glancea2e9-account-delete-jt4d7"] Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.376055 4812 generic.go:334] "Generic (PLEG): container finished" podID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerID="51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612" exitCode=143 Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.376437 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerDied","Data":"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612"} Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.378830 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" event={"ID":"100166d1-b18c-4f2c-80e1-3dabfad2b999","Type":"ContainerStarted","Data":"d070d16b56cd0e6437b9a5ac7f3125f46083e750c6f5e1a070452d2f652d4e2e"} Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.378885 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" event={"ID":"100166d1-b18c-4f2c-80e1-3dabfad2b999","Type":"ContainerStarted","Data":"0cdaf00e817f7d34df8375089b25e89c51044b502934081eb8d4b1467f091c53"} Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.382877 4812 generic.go:334] "Generic (PLEG): container finished" podID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerID="6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6" exitCode=143 Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.382916 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" 
event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerDied","Data":"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6"} Jan 31 04:51:55 crc kubenswrapper[4812]: I0131 04:51:55.398805 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" podStartSLOduration=1.398787865 podStartE2EDuration="1.398787865s" podCreationTimestamp="2026-01-31 04:51:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:51:55.397714483 +0000 UTC m=+1523.892736148" watchObservedRunningTime="2026-01-31 04:51:55.398787865 +0000 UTC m=+1523.893809530" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.354715 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="24978441-de4c-4bbf-8eb9-289c7a56df4d" path="/var/lib/kubelet/pods/24978441-de4c-4bbf-8eb9-289c7a56df4d/volumes" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.394255 4812 generic.go:334] "Generic (PLEG): container finished" podID="61306405-d45a-4632-b239-6feec523a983" containerID="5407ec7c7fc3bcb6599983e4e650f15b7c769ae3126e25b64f2b25d2a92de51b" exitCode=0 Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.394407 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerDied","Data":"5407ec7c7fc3bcb6599983e4e650f15b7c769ae3126e25b64f2b25d2a92de51b"} Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.396800 4812 generic.go:334] "Generic (PLEG): container finished" podID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerID="a7f7035d5ffcca63f5f6e955417650aa325575108fa642738e59e4246359dc7b" exitCode=0 Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.397148 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerDied","Data":"a7f7035d5ffcca63f5f6e955417650aa325575108fa642738e59e4246359dc7b"} Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.398931 4812 generic.go:334] "Generic (PLEG): container finished" podID="100166d1-b18c-4f2c-80e1-3dabfad2b999" containerID="d070d16b56cd0e6437b9a5ac7f3125f46083e750c6f5e1a070452d2f652d4e2e" exitCode=0 Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.398957 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" event={"ID":"100166d1-b18c-4f2c-80e1-3dabfad2b999","Type":"ContainerDied","Data":"d070d16b56cd0e6437b9a5ac7f3125f46083e750c6f5e1a070452d2f652d4e2e"} Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.570692 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.645000 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671299 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671341 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671376 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671446 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671458 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671492 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage06-crc\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671522 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671550 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671567 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671595 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671619 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671647 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume 
\"kube-api-access-t2www\" (UniqueName: \"kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671669 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.671695 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules\") pod \"61306405-d45a-4632-b239-6feec523a983\" (UID: \"61306405-d45a-4632-b239-6feec523a983\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.672022 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.672056 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev" (OuterVolumeSpecName: "dev") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "dev". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.672072 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.673032 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.673081 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run" (OuterVolumeSpecName: "run") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.673104 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.675033 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys" (OuterVolumeSpecName: "sys") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.677428 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts" (OuterVolumeSpecName: "scripts") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.677996 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www" (OuterVolumeSpecName: "kube-api-access-t2www") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "kube-api-access-t2www". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.678049 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.678347 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs" (OuterVolumeSpecName: "logs") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.682295 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage03-crc" (OuterVolumeSpecName: "glance") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "local-storage03-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.697283 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage06-crc" (OuterVolumeSpecName: "glance-cache") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "local-storage06-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.719520 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data" (OuterVolumeSpecName: "config-data") pod "61306405-d45a-4632-b239-6feec523a983" (UID: "61306405-d45a-4632-b239-6feec523a983"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773274 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773309 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773357 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773413 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773440 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773469 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs\") pod 
\"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773492 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773526 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773582 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773599 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773617 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773666 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: 
\"kubernetes.io/local-volume/local-storage13-crc\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773687 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773711 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run\") pod \"83b00f57-a375-47cd-ae26-86353c3ccf61\" (UID: \"83b00f57-a375-47cd-ae26-86353c3ccf61\") " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773826 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev" (OuterVolumeSpecName: "dev") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773902 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "var-locks-brick". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773906 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.773945 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run" (OuterVolumeSpecName: "run") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774017 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys" (OuterVolumeSpecName: "sys") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774017 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774106 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774143 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774433 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs" (OuterVolumeSpecName: "logs") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774653 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774720 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774732 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774765 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/61306405-d45a-4632-b239-6feec523a983-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774775 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774783 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774792 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774799 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774807 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774819 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-t2www\" (UniqueName: \"kubernetes.io/projected/61306405-d45a-4632-b239-6feec523a983-kube-api-access-t2www\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774883 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774895 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774930 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774941 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/83b00f57-a375-47cd-ae26-86353c3ccf61-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774951 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774962 4812 reconciler_common.go:293] 
"Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.774973 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775008 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/61306405-d45a-4632-b239-6feec523a983-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775018 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775027 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775036 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775044 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/61306405-d45a-4632-b239-6feec523a983-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.775053 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/83b00f57-a375-47cd-ae26-86353c3ccf61-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc 
kubenswrapper[4812]: I0131 04:51:56.777505 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts" (OuterVolumeSpecName: "scripts") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.778260 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f" (OuterVolumeSpecName: "kube-api-access-qzh8f") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "kube-api-access-qzh8f". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.778398 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage18-crc" (OuterVolumeSpecName: "glance-cache") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "local-storage18-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.778530 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage13-crc" (OuterVolumeSpecName: "glance") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "local-storage13-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.789486 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage06-crc" (UniqueName: "kubernetes.io/local-volume/local-storage06-crc") on node "crc" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.801809 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage03-crc" (UniqueName: "kubernetes.io/local-volume/local-storage03-crc") on node "crc" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.809800 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data" (OuterVolumeSpecName: "config-data") pod "83b00f57-a375-47cd-ae26-86353c3ccf61" (UID: "83b00f57-a375-47cd-ae26-86353c3ccf61"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876247 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876281 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage03-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage03-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876292 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qzh8f\" (UniqueName: \"kubernetes.io/projected/83b00f57-a375-47cd-ae26-86353c3ccf61-kube-api-access-qzh8f\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876301 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 
crc kubenswrapper[4812]: I0131 04:51:56.876311 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/83b00f57-a375-47cd-ae26-86353c3ccf61-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876332 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" " Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.876343 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage06-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage06-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.889190 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage13-crc" (UniqueName: "kubernetes.io/local-volume/local-storage13-crc") on node "crc" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.900555 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage18-crc" (UniqueName: "kubernetes.io/local-volume/local-storage18-crc") on node "crc" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.978217 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage18-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage18-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:56 crc kubenswrapper[4812]: I0131 04:51:56.978577 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage13-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage13-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.414048 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-1" 
event={"ID":"83b00f57-a375-47cd-ae26-86353c3ccf61","Type":"ContainerDied","Data":"0657f0a1f09551d114964b1bd6a59181475a89012952621141e314d7281f0548"} Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.414288 4812 scope.go:117] "RemoveContainer" containerID="a7f7035d5ffcca63f5f6e955417650aa325575108fa642738e59e4246359dc7b" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.415002 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-1" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.418203 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-1" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.426192 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-1" event={"ID":"61306405-d45a-4632-b239-6feec523a983","Type":"ContainerDied","Data":"7c14fd01d218de2ecd47412731a4fd1f85ab17e5617461fc0efe0e5f11bde4d4"} Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.443373 4812 scope.go:117] "RemoveContainer" containerID="76f04d53de11321b69821bfacf8584ebc9fcbf701e16c36b26ad3627885d0549" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.466105 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.472449 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-1"] Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.479934 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.483364 4812 scope.go:117] "RemoveContainer" containerID="5407ec7c7fc3bcb6599983e4e650f15b7c769ae3126e25b64f2b25d2a92de51b" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.485692 4812 
kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-1"] Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.518197 4812 scope.go:117] "RemoveContainer" containerID="d54859e0fbe35dc69c2ce3fea3e550f34245238ca8e3ce52913295a2e134cedc" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.598149 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:35466->10.217.0.148:9292: read: connection reset by peer" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.598151 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-external-api-0" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.148:9292/healthcheck\": read tcp 10.217.0.2:35456->10.217.0.148:9292: read: connection reset by peer" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.695566 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-httpd" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:33260->10.217.0.149:9292: read: connection reset by peer" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.695605 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="glance-kuttl-tests/glance-default-internal-api-0" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-log" probeResult="failure" output="Get \"http://10.217.0.149:9292/healthcheck\": read tcp 10.217.0.2:33272->10.217.0.149:9292: read: connection reset by peer" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.886662 4812 util.go:48] "No ready sandbox for 
pod can be found. Need to start a new one" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:57 crc kubenswrapper[4812]: I0131 04:51:57.975713 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.008076 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vhzn4\" (UniqueName: \"kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4\") pod \"100166d1-b18c-4f2c-80e1-3dabfad2b999\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.008298 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts\") pod \"100166d1-b18c-4f2c-80e1-3dabfad2b999\" (UID: \"100166d1-b18c-4f2c-80e1-3dabfad2b999\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.009577 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "100166d1-b18c-4f2c-80e1-3dabfad2b999" (UID: "100166d1-b18c-4f2c-80e1-3dabfad2b999"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.020275 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4" (OuterVolumeSpecName: "kube-api-access-vhzn4") pod "100166d1-b18c-4f2c-80e1-3dabfad2b999" (UID: "100166d1-b18c-4f2c-80e1-3dabfad2b999"). InnerVolumeSpecName "kube-api-access-vhzn4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.022926 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109257 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109570 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109680 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109675 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run" (OuterVolumeSpecName: "run") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "run". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109952 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.109764 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "lib-modules". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.110115 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.110234 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.110672 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs" (OuterVolumeSpecName: "logs") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "logs". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111567 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-szkx4\" (UniqueName: \"kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111622 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111653 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111729 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111779 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111796 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111811 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev" (OuterVolumeSpecName: "dev") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111851 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111856 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "etc-nvme". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111882 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111906 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts\") pod \"2c14c489-8003-4c2e-a48f-c0ce57855f30\" (UID: \"2c14c489-8003-4c2e-a48f-c0ce57855f30\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111900 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys" (OuterVolumeSpecName: "sys") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "sys". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.111957 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "etc-iscsi". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112155 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "httpd-run". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112396 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage01-crc" (OuterVolumeSpecName: "glance-cache") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "local-storage01-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112459 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112480 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112492 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112502 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vhzn4\" (UniqueName: \"kubernetes.io/projected/100166d1-b18c-4f2c-80e1-3dabfad2b999-kube-api-access-vhzn4\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112511 4812 reconciler_common.go:293] "Volume detached for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112519 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: 
\"kubernetes.io/empty-dir/2c14c489-8003-4c2e-a48f-c0ce57855f30-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112527 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112535 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112542 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/100166d1-b18c-4f2c-80e1-3dabfad2b999-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112551 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.112559 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/2c14c489-8003-4c2e-a48f-c0ce57855f30-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.114419 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4" (OuterVolumeSpecName: "kube-api-access-szkx4") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "kube-api-access-szkx4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.115059 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts" (OuterVolumeSpecName: "scripts") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "scripts". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.115188 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage05-crc" (OuterVolumeSpecName: "glance") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "local-storage05-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.153527 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data" (OuterVolumeSpecName: "config-data") pod "2c14c489-8003-4c2e-a48f-c0ce57855f30" (UID: "2c14c489-8003-4c2e-a48f-c0ce57855f30"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213125 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213179 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213203 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213216 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213230 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213257 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-s56f8\" (UniqueName: 
\"kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213282 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213285 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick" (OuterVolumeSpecName: "var-locks-brick") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "var-locks-brick". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213328 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213349 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"glance-cache\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213369 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213368 
4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run" (OuterVolumeSpecName: "run") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "run". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213400 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213461 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys" (OuterVolumeSpecName: "sys") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "sys". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213509 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213610 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213658 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run" (OuterVolumeSpecName: "httpd-run") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "httpd-run". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213721 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data\") pod \"c3f36241-e6de-4a61-8f71-5334a7e079f4\" (UID: \"c3f36241-e6de-4a61-8f71-5334a7e079f4\") " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213728 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules" (OuterVolumeSpecName: "lib-modules") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "lib-modules". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213761 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme" (OuterVolumeSpecName: "etc-nvme") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "etc-nvme". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213787 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev" (OuterVolumeSpecName: "dev") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "dev". PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213788 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs" (OuterVolumeSpecName: "logs") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "logs". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.213814 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi" (OuterVolumeSpecName: "etc-iscsi") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "etc-iscsi". 
PluginName "kubernetes.io/host-path", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214404 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-szkx4\" (UniqueName: \"kubernetes.io/projected/2c14c489-8003-4c2e-a48f-c0ce57855f30-kube-api-access-szkx4\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214443 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214464 4812 reconciler_common.go:293] "Volume detached for volume \"logs\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-logs\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214484 4812 reconciler_common.go:293] "Volume detached for volume \"var-locks-brick\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-var-locks-brick\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214501 4812 reconciler_common.go:293] "Volume detached for volume \"httpd-run\" (UniqueName: \"kubernetes.io/empty-dir/c3f36241-e6de-4a61-8f71-5334a7e079f4-httpd-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214517 4812 reconciler_common.go:293] "Volume detached for volume \"run\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-run\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214534 4812 reconciler_common.go:293] "Volume detached for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-lib-modules\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214550 4812 reconciler_common.go:293] "Volume detached for 
volume \"dev\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-dev\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214566 4812 reconciler_common.go:293] "Volume detached for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-sys\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214603 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214621 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/2c14c489-8003-4c2e-a48f-c0ce57855f30-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214638 4812 reconciler_common.go:293] "Volume detached for volume \"etc-nvme\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-nvme\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214663 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.214681 4812 reconciler_common.go:293] "Volume detached for volume \"etc-iscsi\" (UniqueName: \"kubernetes.io/host-path/c3f36241-e6de-4a61-8f71-5334a7e079f4-etc-iscsi\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.216468 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage14-crc" (OuterVolumeSpecName: "glance") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). 
InnerVolumeSpecName "local-storage14-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.217252 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8" (OuterVolumeSpecName: "kube-api-access-s56f8") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "kube-api-access-s56f8". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.217664 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage19-crc" (OuterVolumeSpecName: "glance-cache") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "local-storage19-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.218371 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts" (OuterVolumeSpecName: "scripts") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.237344 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage05-crc" (UniqueName: "kubernetes.io/local-volume/local-storage05-crc") on node "crc" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.246551 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage01-crc" (UniqueName: "kubernetes.io/local-volume/local-storage01-crc") on node "crc" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.274976 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data" (OuterVolumeSpecName: "config-data") pod "c3f36241-e6de-4a61-8f71-5334a7e079f4" (UID: "c3f36241-e6de-4a61-8f71-5334a7e079f4"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317513 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage01-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage01-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317579 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317600 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-s56f8\" (UniqueName: \"kubernetes.io/projected/c3f36241-e6de-4a61-8f71-5334a7e079f4-kube-api-access-s56f8\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317654 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" " Jan 31 04:51:58 crc 
kubenswrapper[4812]: I0131 04:51:58.317680 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" " Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317701 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/c3f36241-e6de-4a61-8f71-5334a7e079f4-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.317718 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage05-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage05-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.338420 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage19-crc" (UniqueName: "kubernetes.io/local-volume/local-storage19-crc") on node "crc" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.344267 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage14-crc" (UniqueName: "kubernetes.io/local-volume/local-storage14-crc") on node "crc" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.353823 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="61306405-d45a-4632-b239-6feec523a983" path="/var/lib/kubelet/pods/61306405-d45a-4632-b239-6feec523a983/volumes" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.355255 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" path="/var/lib/kubelet/pods/83b00f57-a375-47cd-ae26-86353c3ccf61/volumes" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.419960 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage14-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage14-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.420045 4812 
reconciler_common.go:293] "Volume detached for volume \"local-storage19-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage19-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.444345 4812 generic.go:334] "Generic (PLEG): container finished" podID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerID="8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee" exitCode=0 Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.444402 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerDied","Data":"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee"} Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.444428 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-external-api-0" event={"ID":"2c14c489-8003-4c2e-a48f-c0ce57855f30","Type":"ContainerDied","Data":"cbc658d76c0c8e85c44de8c5578b9617bf6d1ad94ba072b366d2f393a5cedc97"} Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.444446 4812 scope.go:117] "RemoveContainer" containerID="8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.444561 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-external-api-0" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.449635 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" event={"ID":"100166d1-b18c-4f2c-80e1-3dabfad2b999","Type":"ContainerDied","Data":"0cdaf00e817f7d34df8375089b25e89c51044b502934081eb8d4b1467f091c53"} Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.449665 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/glancea2e9-account-delete-jt4d7" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.449690 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="0cdaf00e817f7d34df8375089b25e89c51044b502934081eb8d4b1467f091c53" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.454338 4812 generic.go:334] "Generic (PLEG): container finished" podID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerID="cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94" exitCode=0 Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.454381 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerDied","Data":"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94"} Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.454390 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/glance-default-internal-api-0" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.454399 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/glance-default-internal-api-0" event={"ID":"c3f36241-e6de-4a61-8f71-5334a7e079f4","Type":"ContainerDied","Data":"b7463e5229035dc905e8e6500a7e3c5dd6ccc0271a1cd2e15e4e0ec35262810a"} Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.471701 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.485289 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-external-api-0"] Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.486419 4812 scope.go:117] "RemoveContainer" containerID="51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.491948 4812 
kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.497420 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-default-internal-api-0"] Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.502692 4812 scope.go:117] "RemoveContainer" containerID="8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee" Jan 31 04:51:58 crc kubenswrapper[4812]: E0131 04:51:58.503089 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee\": container with ID starting with 8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee not found: ID does not exist" containerID="8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.503148 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee"} err="failed to get container status \"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee\": rpc error: code = NotFound desc = could not find container \"8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee\": container with ID starting with 8b1e6191d642d73b066fa1742577a1fc43ca69ab224a01e93f6a714f2c3902ee not found: ID does not exist" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.503172 4812 scope.go:117] "RemoveContainer" containerID="51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612" Jan 31 04:51:58 crc kubenswrapper[4812]: E0131 04:51:58.503425 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612\": container with ID starting with 
51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612 not found: ID does not exist" containerID="51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.503457 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612"} err="failed to get container status \"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612\": rpc error: code = NotFound desc = could not find container \"51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612\": container with ID starting with 51253456d67788316bbddadff3b9ab9da57d1f6e462adbfcf9f59188e777e612 not found: ID does not exist" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.503480 4812 scope.go:117] "RemoveContainer" containerID="cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.522732 4812 scope.go:117] "RemoveContainer" containerID="6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.539265 4812 scope.go:117] "RemoveContainer" containerID="cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94" Jan 31 04:51:58 crc kubenswrapper[4812]: E0131 04:51:58.539704 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94\": container with ID starting with cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94 not found: ID does not exist" containerID="cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.539738 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94"} 
err="failed to get container status \"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94\": rpc error: code = NotFound desc = could not find container \"cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94\": container with ID starting with cdba5c609eb06dbbd3c92ef41165efcac4c9622639ec27ef5dc1fc04b78cfc94 not found: ID does not exist" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.539765 4812 scope.go:117] "RemoveContainer" containerID="6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6" Jan 31 04:51:58 crc kubenswrapper[4812]: E0131 04:51:58.540040 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6\": container with ID starting with 6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6 not found: ID does not exist" containerID="6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6" Jan 31 04:51:58 crc kubenswrapper[4812]: I0131 04:51:58.540086 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6"} err="failed to get container status \"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6\": rpc error: code = NotFound desc = could not find container \"6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6\": container with ID starting with 6d4bfcc2c6c95ca7279415fcdcb489213f17df3849a71b522f8491607bf754c6 not found: ID does not exist" Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.403350 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-db-create-thhl4"] Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.412998 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-db-create-thhl4"] Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.423286 
4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glancea2e9-account-delete-jt4d7"] Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.430359 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"] Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.439060 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glance-a2e9-account-create-update-m4fls"] Jan 31 04:51:59 crc kubenswrapper[4812]: I0131 04:51:59.445909 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/glancea2e9-account-delete-jt4d7"] Jan 31 04:52:00 crc kubenswrapper[4812]: I0131 04:52:00.349624 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="100166d1-b18c-4f2c-80e1-3dabfad2b999" path="/var/lib/kubelet/pods/100166d1-b18c-4f2c-80e1-3dabfad2b999/volumes" Jan 31 04:52:00 crc kubenswrapper[4812]: I0131 04:52:00.350222 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" path="/var/lib/kubelet/pods/2c14c489-8003-4c2e-a48f-c0ce57855f30/volumes" Jan 31 04:52:00 crc kubenswrapper[4812]: I0131 04:52:00.350773 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6ae1265c-41c1-4871-998d-fd3f17598b8e" path="/var/lib/kubelet/pods/6ae1265c-41c1-4871-998d-fd3f17598b8e/volumes" Jan 31 04:52:00 crc kubenswrapper[4812]: I0131 04:52:00.352125 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bf00dafa-494f-411d-9155-4bc61d08b46b" path="/var/lib/kubelet/pods/bf00dafa-494f-411d-9155-4bc61d08b46b/volumes" Jan 31 04:52:00 crc kubenswrapper[4812]: I0131 04:52:00.352757 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" path="/var/lib/kubelet/pods/c3f36241-e6de-4a61-8f71-5334a7e079f4/volumes" Jan 31 04:52:02 crc kubenswrapper[4812]: E0131 04:52:02.924640 4812 upgradeaware.go:427] 
Error proxying data from client to backend: readfrom tcp 38.102.83.238:33192->38.102.83.238:34831: write tcp 38.102.83.238:33192->38.102.83.238:34831: write: broken pipe Jan 31 04:52:06 crc kubenswrapper[4812]: E0131 04:52:06.071492 4812 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:33312->38.102.83.238:34831: write tcp 38.102.83.238:33312->38.102.83.238:34831: write: broken pipe Jan 31 04:52:07 crc kubenswrapper[4812]: E0131 04:52:07.186936 4812 upgradeaware.go:427] Error proxying data from client to backend: readfrom tcp 38.102.83.238:47752->38.102.83.238:34831: write tcp 38.102.83.238:47752->38.102.83.238:34831: write: broken pipe Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.437086 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-mw6hz"] Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.455805 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.456426 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-server" containerID="cri-o://6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.456822 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="swift-recon-cron" containerID="cri-o://54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.456932 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="rsync" 
containerID="cri-o://9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.456975 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-expirer" containerID="cri-o://055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457026 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-updater" containerID="cri-o://4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457062 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-auditor" containerID="cri-o://8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457104 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-replicator" containerID="cri-o://db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457142 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-server" containerID="cri-o://fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457184 4812 kuberuntime_container.go:808] 
"Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-updater" containerID="cri-o://62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457227 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-auditor" containerID="cri-o://3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457268 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-replicator" containerID="cri-o://552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457306 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-server" containerID="cri-o://46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457345 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-reaper" containerID="cri-o://7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457386 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-auditor" 
containerID="cri-o://d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.457456 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-storage-0" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-replicator" containerID="cri-o://cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.458826 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-ring-rebalance-mw6hz"] Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.490242 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.490483 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-httpd" containerID="cri-o://9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" gracePeriod=30 Jan 31 04:52:07 crc kubenswrapper[4812]: I0131 04:52:07.490976 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-server" containerID="cri-o://5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" gracePeriod=30 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.225815 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.348448 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="55388d4d-80ca-4221-9a99-37c116dda83e" path="/var/lib/kubelet/pods/55388d4d-80ca-4221-9a99-37c116dda83e/volumes" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.354650 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd\") pod \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.354716 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-92f4g\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g\") pod \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.354870 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") pod \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.354916 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data\") pod \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.354968 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd\") pod 
\"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\" (UID: \"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae\") " Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.355371 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd" (OuterVolumeSpecName: "run-httpd") pod "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae"). InnerVolumeSpecName "run-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.355458 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd" (OuterVolumeSpecName: "log-httpd") pod "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae"). InnerVolumeSpecName "log-httpd". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.360888 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.361582 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g" (OuterVolumeSpecName: "kube-api-access-92f4g") pod "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae"). InnerVolumeSpecName "kube-api-access-92f4g". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.403600 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data" (OuterVolumeSpecName: "config-data") pod "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" (UID: "2b92b5ab-106c-4ea9-a9f3-461ec016b1ae"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.456672 4812 reconciler_common.go:293] "Volume detached for volume \"run-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-run-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.456865 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-92f4g\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-kube-api-access-92f4g\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.457041 4812 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.457077 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.457090 4812 reconciler_common.go:293] "Volume detached for volume \"log-httpd\" (UniqueName: \"kubernetes.io/empty-dir/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae-log-httpd\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552681 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" 
containerID="9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552718 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552728 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552737 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552746 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552754 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552764 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552773 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552783 4812 generic.go:334] "Generic (PLEG): container 
finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552791 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552799 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552808 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552818 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552827 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552908 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552940 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" 
event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552955 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552967 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552979 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.552991 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553006 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553017 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d"} 
Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553030 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553043 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553055 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553066 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553096 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.553108 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.554919 4812 generic.go:334] "Generic (PLEG): container finished" 
podID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerID="5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.554948 4812 generic.go:334] "Generic (PLEG): container finished" podID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerID="9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" exitCode=0 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.554968 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerDied","Data":"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.554988 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerDied","Data":"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.555002 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" event={"ID":"2b92b5ab-106c-4ea9-a9f3-461ec016b1ae","Type":"ContainerDied","Data":"5e4e3ef6b9f40718d69513cc65dc753d997aa99542746b2a201399f5c66869cc"} Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.555020 4812 scope.go:117] "RemoveContainer" containerID="5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.555164 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.579788 4812 scope.go:117] "RemoveContainer" containerID="9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.652042 4812 scope.go:117] "RemoveContainer" containerID="5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.652644 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07\": container with ID starting with 5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07 not found: ID does not exist" containerID="5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.652697 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07"} err="failed to get container status \"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07\": rpc error: code = NotFound desc = could not find container \"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07\": container with ID starting with 5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07 not found: ID does not exist" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.652732 4812 scope.go:117] "RemoveContainer" containerID="9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.653147 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0\": container with ID starting with 
9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0 not found: ID does not exist" containerID="9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.653214 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0"} err="failed to get container status \"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0\": rpc error: code = NotFound desc = could not find container \"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0\": container with ID starting with 9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0 not found: ID does not exist" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.653260 4812 scope.go:117] "RemoveContainer" containerID="5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.653565 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07"} err="failed to get container status \"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07\": rpc error: code = NotFound desc = could not find container \"5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07\": container with ID starting with 5394e5f63cc54a4e00dc9311fd2b7fd3b5c66cafcb06b4b998ddb558c9378d07 not found: ID does not exist" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.653630 4812 scope.go:117] "RemoveContainer" containerID="9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.654172 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0"} err="failed to get container status 
\"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0\": rpc error: code = NotFound desc = could not find container \"9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0\": container with ID starting with 9171fb30e22a7f8b44db1794a7cafd8d3858d88eb9a890bc5ec372d2dc5658f0 not found: ID does not exist" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.655645 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.662767 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-proxy-6d699db77c-g7fm4"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.898186 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-pdfkq"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.909620 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-czvn5"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.919346 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.919669 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" podUID="949a611c-00dc-4dac-9068-0dc00cf79572" containerName="keystone-api" containerID="cri-o://453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab" gracePeriod=30 Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.928258 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-sync-pdfkq"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.946980 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-bootstrap-czvn5"] Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951185 4812 kubelet.go:2421] "SyncLoop ADD" source="api" 
pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951467 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951478 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951488 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951494 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951509 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-server" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951515 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-server" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951525 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951531 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951538 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951544 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" 
containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951555 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="100166d1-b18c-4f2c-80e1-3dabfad2b999" containerName="mariadb-account-delete" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951561 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="100166d1-b18c-4f2c-80e1-3dabfad2b999" containerName="mariadb-account-delete" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951568 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951573 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951586 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951591 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951604 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951609 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951618 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951624 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-httpd" Jan 31 
04:52:08 crc kubenswrapper[4812]: E0131 04:52:08.951635 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951640 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951749 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951758 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951771 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951779 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="2c14c489-8003-4c2e-a48f-c0ce57855f30" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951788 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-log" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951797 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951804 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c3f36241-e6de-4a61-8f71-5334a7e079f4" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951811 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="61306405-d45a-4632-b239-6feec523a983" containerName="glance-log" Jan 31 04:52:08 crc 
kubenswrapper[4812]: I0131 04:52:08.951820 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="100166d1-b18c-4f2c-80e1-3dabfad2b999" containerName="mariadb-account-delete" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951827 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" containerName="proxy-server" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.951852 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="83b00f57-a375-47cd-ae26-86353c3ccf61" containerName="glance-httpd" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.952273 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:08 crc kubenswrapper[4812]: I0131 04:52:08.954685 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.064672 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jg28q\" (UniqueName: \"kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.064806 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.166768 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.166939 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jg28q\" (UniqueName: \"kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.167631 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.191139 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-jg28q\" (UniqueName: \"kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q\") pod \"keystone0ca8-account-delete-nmrnh\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.268762 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.541462 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ztrnl"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.559938 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-ztrnl"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.570104 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["glance-kuttl-tests/root-account-create-update-4jqjn"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.571263 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.576717 4812 reflector.go:368] Caches populated for *v1.Secret from object-"glance-kuttl-tests"/"openstack-mariadb-root-db-secret" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.581730 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4jqjn"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.600421 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.610062 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.617783 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.630862 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4jqjn"] Jan 31 04:52:09 crc kubenswrapper[4812]: E0131 04:52:09.631467 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted 
volumes=[kube-api-access-jxb8s operator-scripts], unattached volumes=[], failed to process volumes=[]: context canceled" pod="glance-kuttl-tests/root-account-create-update-4jqjn" podUID="7835f82a-699e-43b6-894d-8daf78e59d03" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.673734 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jxb8s\" (UniqueName: \"kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.674016 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.704631 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.758198 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-2" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="galera" containerID="cri-o://b698949a0a69c7dc614f1c6150a7e3959a0c7af53a41006aaa9bd24009590a46" gracePeriod=30 Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.774817 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb8s\" (UniqueName: \"kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " 
pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:09 crc kubenswrapper[4812]: I0131 04:52:09.774892 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:09 crc kubenswrapper[4812]: E0131 04:52:09.775024 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:09 crc kubenswrapper[4812]: E0131 04:52:09.775084 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. No retries permitted until 2026-01-31 04:52:10.275068261 +0000 UTC m=+1538.770089926 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : configmap "openstack-scripts" not found Jan 31 04:52:09 crc kubenswrapper[4812]: E0131 04:52:09.782586 4812 projected.go:194] Error preparing data for projected volume kube-api-access-jxb8s for pod glance-kuttl-tests/root-account-create-update-4jqjn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:09 crc kubenswrapper[4812]: E0131 04:52:09.782668 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. No retries permitted until 2026-01-31 04:52:10.282637495 +0000 UTC m=+1538.777659150 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxb8s" (UniqueName: "kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.188934 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.189611 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/memcached-0" podUID="014dcd69-d412-4c77-8a96-521ffc036f50" containerName="memcached" containerID="cri-o://7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce" gracePeriod=30 Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.282121 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:10 crc kubenswrapper[4812]: E0131 04:52:10.282229 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:10 crc kubenswrapper[4812]: E0131 04:52:10.282313 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. No retries permitted until 2026-01-31 04:52:11.282292412 +0000 UTC m=+1539.777314087 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : configmap "openstack-scripts" not found Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.351251 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2b92b5ab-106c-4ea9-a9f3-461ec016b1ae" path="/var/lib/kubelet/pods/2b92b5ab-106c-4ea9-a9f3-461ec016b1ae/volumes" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.352942 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="718bb9c9-540b-4aae-ae69-fff60f342873" path="/var/lib/kubelet/pods/718bb9c9-540b-4aae-ae69-fff60f342873/volumes" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.354271 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a33f3025-e1cb-4935-b358-3badf3fda335" path="/var/lib/kubelet/pods/a33f3025-e1cb-4935-b358-3badf3fda335/volumes" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.363652 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c5b8de02-d057-468f-9df6-b18f83bc6dbe" path="/var/lib/kubelet/pods/c5b8de02-d057-468f-9df6-b18f83bc6dbe/volumes" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.383389 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb8s\" (UniqueName: \"kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:10 crc kubenswrapper[4812]: E0131 04:52:10.387055 4812 projected.go:194] Error preparing data for projected volume kube-api-access-jxb8s for pod glance-kuttl-tests/root-account-create-update-4jqjn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:10 
crc kubenswrapper[4812]: E0131 04:52:10.387143 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. No retries permitted until 2026-01-31 04:52:11.387122045 +0000 UTC m=+1539.882143720 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-jxb8s" (UniqueName: "kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.572888 4812 generic.go:334] "Generic (PLEG): container finished" podID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerID="b698949a0a69c7dc614f1c6150a7e3959a0c7af53a41006aaa9bd24009590a46" exitCode=0 Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.572960 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerDied","Data":"b698949a0a69c7dc614f1c6150a7e3959a0c7af53a41006aaa9bd24009590a46"} Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.572991 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-2" event={"ID":"546d2f27-dfde-4446-978b-19b2e6a1d6a0","Type":"ContainerDied","Data":"a7d02b1f60921b56dfabadefeec3379818b71014bfb53d056d0a41e39426ef72"} Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.573003 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="a7d02b1f60921b56dfabadefeec3379818b71014bfb53d056d0a41e39426ef72" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.574230 4812 generic.go:334] "Generic (PLEG): container finished" podID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" 
containerID="57d40e84f61515f45f041dc6a75dcce9c8d1851d79c704fdf16d5a156e2d1285" exitCode=1 Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.574312 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.574998 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" event={"ID":"4f0da6d0-0d0b-40b3-a3f7-90470b94115b","Type":"ContainerDied","Data":"57d40e84f61515f45f041dc6a75dcce9c8d1851d79c704fdf16d5a156e2d1285"} Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.575056 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" event={"ID":"4f0da6d0-0d0b-40b3-a3f7-90470b94115b","Type":"ContainerStarted","Data":"1560edc753f5e8bf643d9b4d7ca612c740697ba191b460ae36062d04bf846de0"} Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.575118 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" secret="" err="secret \"galera-openstack-dockercfg-gtjhm\" not found" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.575160 4812 scope.go:117] "RemoveContainer" containerID="57d40e84f61515f45f041dc6a75dcce9c8d1851d79c704fdf16d5a156e2d1285" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.640739 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.680499 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.683399 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:10 crc kubenswrapper[4812]: E0131 04:52:10.686916 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:10 crc kubenswrapper[4812]: E0131 04:52:10.686973 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts podName:4f0da6d0-0d0b-40b3-a3f7-90470b94115b nodeName:}" failed. No retries permitted until 2026-01-31 04:52:11.186958303 +0000 UTC m=+1539.681979978 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts") pod "keystone0ca8-account-delete-nmrnh" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b") : configmap "openstack-scripts" not found Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.787909 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xh94m\" (UniqueName: \"kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788432 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788466 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: 
\"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788490 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788515 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788588 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") pod \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\" (UID: \"546d2f27-dfde-4446-978b-19b2e6a1d6a0\") " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788867 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.788927 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "config-data-generated". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.789224 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.789248 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/546d2f27-dfde-4446-978b-19b2e6a1d6a0-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.789404 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.791031 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.794486 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m" (OuterVolumeSpecName: "kube-api-access-xh94m") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "kube-api-access-xh94m". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.800611 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage16-crc" (OuterVolumeSpecName: "mysql-db") pod "546d2f27-dfde-4446-978b-19b2e6a1d6a0" (UID: "546d2f27-dfde-4446-978b-19b2e6a1d6a0"). InnerVolumeSpecName "local-storage16-crc". PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.890744 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xh94m\" (UniqueName: \"kubernetes.io/projected/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kube-api-access-xh94m\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.891041 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" " Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.891109 4812 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.891166 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/546d2f27-dfde-4446-978b-19b2e6a1d6a0-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.903141 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage16-crc" (UniqueName: "kubernetes.io/local-volume/local-storage16-crc") on node "crc" Jan 31 04:52:10 crc kubenswrapper[4812]: I0131 04:52:10.993871 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage16-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage16-crc\") on node \"crc\" 
DevicePath \"\"" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.064770 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.195763 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.195861 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts podName:4f0da6d0-0d0b-40b3-a3f7-90470b94115b nodeName:}" failed. No retries permitted until 2026-01-31 04:52:12.195819396 +0000 UTC m=+1540.690841061 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts") pod "keystone0ca8-account-delete-nmrnh" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b") : configmap "openstack-scripts" not found Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.296556 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.296719 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.296776 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. 
No retries permitted until 2026-01-31 04:52:13.296760882 +0000 UTC m=+1541.791782547 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : configmap "openstack-scripts" not found Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.398433 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-jxb8s\" (UniqueName: \"kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s\") pod \"root-account-create-update-4jqjn\" (UID: \"7835f82a-699e-43b6-894d-8daf78e59d03\") " pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.402573 4812 projected.go:194] Error preparing data for projected volume kube-api-access-jxb8s for pod glance-kuttl-tests/root-account-create-update-4jqjn: failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.402659 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s podName:7835f82a-699e-43b6-894d-8daf78e59d03 nodeName:}" failed. No retries permitted until 2026-01-31 04:52:13.402635438 +0000 UTC m=+1541.897657123 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-jxb8s" (UniqueName: "kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s") pod "root-account-create-update-4jqjn" (UID: "7835f82a-699e-43b6-894d-8daf78e59d03") : failed to fetch token: serviceaccounts "galera-openstack" not found Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.587577 4812 generic.go:334] "Generic (PLEG): container finished" podID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerID="19a8ce328eab525a9058852edc6ae8c6ad438e3de4bcc13a5b77bfe48cb2934b" exitCode=1 Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.587652 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-2" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.587673 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" event={"ID":"4f0da6d0-0d0b-40b3-a3f7-90470b94115b","Type":"ContainerDied","Data":"19a8ce328eab525a9058852edc6ae8c6ad438e3de4bcc13a5b77bfe48cb2934b"} Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.587735 4812 scope.go:117] "RemoveContainer" containerID="57d40e84f61515f45f041dc6a75dcce9c8d1851d79c704fdf16d5a156e2d1285" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.587863 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/root-account-create-update-4jqjn" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.588256 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." 
pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" secret="" err="secret \"galera-openstack-dockercfg-gtjhm\" not found" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.588311 4812 scope.go:117] "RemoveContainer" containerID="19a8ce328eab525a9058852edc6ae8c6ad438e3de4bcc13a5b77bfe48cb2934b" Jan 31 04:52:11 crc kubenswrapper[4812]: E0131 04:52:11.588666 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone0ca8-account-delete-nmrnh_glance-kuttl-tests(4f0da6d0-0d0b-40b3-a3f7-90470b94115b)\"" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.647198 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/rabbitmq-server-0" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="rabbitmq" containerID="cri-o://086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d" gracePeriod=604800 Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.696796 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4jqjn"] Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.716952 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/root-account-create-update-4jqjn"] Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.724902 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.734885 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-2"] Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.807004 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: 
\"kubernetes.io/configmap/7835f82a-699e-43b6-894d-8daf78e59d03-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.807038 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jxb8s\" (UniqueName: \"kubernetes.io/projected/7835f82a-699e-43b6-894d-8daf78e59d03-kube-api-access-jxb8s\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.869817 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-1" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="galera" containerID="cri-o://5b262d6b7da31fc5a2b2f4365848ada534c8499fe6dece82db9aa3201448502c" gracePeriod=28 Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.875148 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:52:11 crc kubenswrapper[4812]: I0131 04:52:11.875335 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" podUID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" containerName="manager" containerID="cri-o://e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c" gracePeriod=10 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.105704 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.105947 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/glance-operator-index-gqrtf" podUID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" containerName="registry-server" containerID="cri-o://7ab6ed6bbc496845b23c712bc1de59a49f4d382adcab786248a95bb9c417d985" gracePeriod=30 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.169093 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" 
pods=["openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.175349 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/340cf2c1467e087784341680c98dd7cf12e641878deab1e0f394bd1894kfbqc"] Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.214653 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.214715 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts podName:4f0da6d0-0d0b-40b3-a3f7-90470b94115b nodeName:}" failed. No retries permitted until 2026-01-31 04:52:14.214701427 +0000 UTC m=+1542.709723092 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts") pod "keystone0ca8-account-delete-nmrnh" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b") : configmap "openstack-scripts" not found Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.297084 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.363663 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d4dc508-c84d-4128-a69c-9bddda59bb41" path="/var/lib/kubelet/pods/1d4dc508-c84d-4128-a69c-9bddda59bb41/volumes" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.364907 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" path="/var/lib/kubelet/pods/546d2f27-dfde-4446-978b-19b2e6a1d6a0/volumes" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.365300 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7835f82a-699e-43b6-894d-8daf78e59d03" path="/var/lib/kubelet/pods/7835f82a-699e-43b6-894d-8daf78e59d03/volumes" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.422853 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config\") pod \"014dcd69-d412-4c77-8a96-521ffc036f50\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.422920 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data\") pod \"014dcd69-d412-4c77-8a96-521ffc036f50\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.423023 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xt4q2\" (UniqueName: \"kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2\") pod \"014dcd69-d412-4c77-8a96-521ffc036f50\" (UID: \"014dcd69-d412-4c77-8a96-521ffc036f50\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.424013 4812 operation_generator.go:803] UnmountVolume.TearDown 
succeeded for volume "kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "014dcd69-d412-4c77-8a96-521ffc036f50" (UID: "014dcd69-d412-4c77-8a96-521ffc036f50"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.424105 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data" (OuterVolumeSpecName: "config-data") pod "014dcd69-d412-4c77-8a96-521ffc036f50" (UID: "014dcd69-d412-4c77-8a96-521ffc036f50"). InnerVolumeSpecName "config-data". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.427944 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.428903 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2" (OuterVolumeSpecName: "kube-api-access-xt4q2") pod "014dcd69-d412-4c77-8a96-521ffc036f50" (UID: "014dcd69-d412-4c77-8a96-521ffc036f50"). InnerVolumeSpecName "kube-api-access-xt4q2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.524683 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vmqtk\" (UniqueName: \"kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk\") pod \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.524730 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert\") pod \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.524757 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert\") pod \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\" (UID: \"99392b95-2df1-4600-afd1-c6a4f4d47e5c\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.525417 4812 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.525431 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/configmap/014dcd69-d412-4c77-8a96-521ffc036f50-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.525439 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xt4q2\" (UniqueName: \"kubernetes.io/projected/014dcd69-d412-4c77-8a96-521ffc036f50-kube-api-access-xt4q2\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.528655 4812 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "99392b95-2df1-4600-afd1-c6a4f4d47e5c" (UID: "99392b95-2df1-4600-afd1-c6a4f4d47e5c"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.528719 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk" (OuterVolumeSpecName: "kube-api-access-vmqtk") pod "99392b95-2df1-4600-afd1-c6a4f4d47e5c" (UID: "99392b95-2df1-4600-afd1-c6a4f4d47e5c"). InnerVolumeSpecName "kube-api-access-vmqtk". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.529118 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "99392b95-2df1-4600-afd1-c6a4f4d47e5c" (UID: "99392b95-2df1-4600-afd1-c6a4f4d47e5c"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.535195 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.594886 4812 generic.go:334] "Generic (PLEG): container finished" podID="949a611c-00dc-4dac-9068-0dc00cf79572" containerID="453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab" exitCode=0 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.594940 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" event={"ID":"949a611c-00dc-4dac-9068-0dc00cf79572","Type":"ContainerDied","Data":"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.594964 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" event={"ID":"949a611c-00dc-4dac-9068-0dc00cf79572","Type":"ContainerDied","Data":"c42d502265c3eb4d4196536705a7a9ea7e2b842ca955631a95c15552cf960050"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.594982 4812 scope.go:117] "RemoveContainer" containerID="453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.595070 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone-5c49cbbfd-wfwms" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.598130 4812 generic.go:334] "Generic (PLEG): container finished" podID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" containerID="7ab6ed6bbc496845b23c712bc1de59a49f4d382adcab786248a95bb9c417d985" exitCode=0 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.598186 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-gqrtf" event={"ID":"a84ac0d3-4f66-44f0-8566-37dd9a31bb66","Type":"ContainerDied","Data":"7ab6ed6bbc496845b23c712bc1de59a49f4d382adcab786248a95bb9c417d985"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.600513 4812 generic.go:334] "Generic (PLEG): container finished" podID="014dcd69-d412-4c77-8a96-521ffc036f50" containerID="7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce" exitCode=0 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.600577 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/memcached-0" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.600584 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"014dcd69-d412-4c77-8a96-521ffc036f50","Type":"ContainerDied","Data":"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.600613 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/memcached-0" event={"ID":"014dcd69-d412-4c77-8a96-521ffc036f50","Type":"ContainerDied","Data":"f74de4f57246c2edca36958d666a4a06cc8d5e40c67c651d1ed098a1061a4eb0"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.601779 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.603453 4812 kubelet_pods.go:1007] "Unable to retrieve pull secret, the image pull may not succeed." pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" secret="" err="secret \"galera-openstack-dockercfg-gtjhm\" not found" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.603491 4812 scope.go:117] "RemoveContainer" containerID="19a8ce328eab525a9058852edc6ae8c6ad438e3de4bcc13a5b77bfe48cb2934b" Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.603739 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"mariadb-account-delete\" with CrashLoopBackOff: \"back-off 10s restarting failed container=mariadb-account-delete pod=keystone0ca8-account-delete-nmrnh_glance-kuttl-tests(4f0da6d0-0d0b-40b3-a3f7-90470b94115b)\"" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.605095 4812 generic.go:334] "Generic (PLEG): container finished" podID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" containerID="e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c" exitCode=0 Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.605145 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" event={"ID":"99392b95-2df1-4600-afd1-c6a4f4d47e5c","Type":"ContainerDied","Data":"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c"} Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.605172 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" event={"ID":"99392b95-2df1-4600-afd1-c6a4f4d47e5c","Type":"ContainerDied","Data":"7d61e9358953928204894cb921dbaf815bc0ba5709aa7d5d9692bc6f9d718363"} Jan 31 04:52:12 crc 
kubenswrapper[4812]: I0131 04:52:12.605226 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.633617 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-x2ndp\" (UniqueName: \"kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp\") pod \"949a611c-00dc-4dac-9068-0dc00cf79572\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.633718 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys\") pod \"949a611c-00dc-4dac-9068-0dc00cf79572\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.633759 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data\") pod \"949a611c-00dc-4dac-9068-0dc00cf79572\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.633816 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts\") pod \"949a611c-00dc-4dac-9068-0dc00cf79572\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.633831 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys\") pod \"949a611c-00dc-4dac-9068-0dc00cf79572\" (UID: \"949a611c-00dc-4dac-9068-0dc00cf79572\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 
04:52:12.634558 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-vmqtk\" (UniqueName: \"kubernetes.io/projected/99392b95-2df1-4600-afd1-c6a4f4d47e5c-kube-api-access-vmqtk\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.634575 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.634583 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/99392b95-2df1-4600-afd1-c6a4f4d47e5c-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.634647 4812 scope.go:117] "RemoveContainer" containerID="453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab" Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.635154 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab\": container with ID starting with 453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab not found: ID does not exist" containerID="453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.635189 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab"} err="failed to get container status \"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab\": rpc error: code = NotFound desc = could not find container \"453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab\": container with ID starting with 453146f57e21d7bd587e661141739d0229c65699c13ce38b10cadc6e64aa3aab not found: ID does not exist" 
Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.635218 4812 scope.go:117] "RemoveContainer" containerID="7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.654305 4812 scope.go:117] "RemoveContainer" containerID="7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce" Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.655593 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce\": container with ID starting with 7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce not found: ID does not exist" containerID="7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.655644 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce"} err="failed to get container status \"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce\": rpc error: code = NotFound desc = could not find container \"7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce\": container with ID starting with 7696bdf03c2a1ca7766a38c6ff63891fd08ca520a24d68e050f966f34b19b9ce not found: ID does not exist" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.656524 4812 scope.go:117] "RemoveContainer" containerID="e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.663606 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts" (OuterVolumeSpecName: "scripts") pod "949a611c-00dc-4dac-9068-0dc00cf79572" (UID: "949a611c-00dc-4dac-9068-0dc00cf79572"). InnerVolumeSpecName "scripts". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.663614 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys" (OuterVolumeSpecName: "credential-keys") pod "949a611c-00dc-4dac-9068-0dc00cf79572" (UID: "949a611c-00dc-4dac-9068-0dc00cf79572"). InnerVolumeSpecName "credential-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.663655 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys" (OuterVolumeSpecName: "fernet-keys") pod "949a611c-00dc-4dac-9068-0dc00cf79572" (UID: "949a611c-00dc-4dac-9068-0dc00cf79572"). InnerVolumeSpecName "fernet-keys". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.663701 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp" (OuterVolumeSpecName: "kube-api-access-x2ndp") pod "949a611c-00dc-4dac-9068-0dc00cf79572" (UID: "949a611c-00dc-4dac-9068-0dc00cf79572"). InnerVolumeSpecName "kube-api-access-x2ndp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.665550 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data" (OuterVolumeSpecName: "config-data") pod "949a611c-00dc-4dac-9068-0dc00cf79572" (UID: "949a611c-00dc-4dac-9068-0dc00cf79572"). InnerVolumeSpecName "config-data". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.667505 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.673280 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-controller-manager-7d59b884bb-tn2x6"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.684886 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.691028 4812 scope.go:117] "RemoveContainer" containerID="e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c" Jan 31 04:52:12 crc kubenswrapper[4812]: E0131 04:52:12.691491 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c\": container with ID starting with e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c not found: ID does not exist" containerID="e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.691534 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c"} err="failed to get container status \"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c\": rpc error: code = NotFound desc = could not find container \"e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c\": container with ID starting with e3c61fb2702629c50da6dc1b5bef75c9bef0b8d1c868aace5d81d3276a80498c not found: ID does not exist" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.692874 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/memcached-0"] Jan 
31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.735722 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mhxdz\" (UniqueName: \"kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz\") pod \"a84ac0d3-4f66-44f0-8566-37dd9a31bb66\" (UID: \"a84ac0d3-4f66-44f0-8566-37dd9a31bb66\") " Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.736021 4812 reconciler_common.go:293] "Volume detached for volume \"fernet-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-fernet-keys\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.736037 4812 reconciler_common.go:293] "Volume detached for volume \"config-data\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-config-data\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.736047 4812 reconciler_common.go:293] "Volume detached for volume \"scripts\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.736055 4812 reconciler_common.go:293] "Volume detached for volume \"credential-keys\" (UniqueName: \"kubernetes.io/secret/949a611c-00dc-4dac-9068-0dc00cf79572-credential-keys\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.736065 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-x2ndp\" (UniqueName: \"kubernetes.io/projected/949a611c-00dc-4dac-9068-0dc00cf79572-kube-api-access-x2ndp\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.738636 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz" (OuterVolumeSpecName: "kube-api-access-mhxdz") pod "a84ac0d3-4f66-44f0-8566-37dd9a31bb66" 
(UID: "a84ac0d3-4f66-44f0-8566-37dd9a31bb66"). InnerVolumeSpecName "kube-api-access-mhxdz". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.837583 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-mhxdz\" (UniqueName: \"kubernetes.io/projected/a84ac0d3-4f66-44f0-8566-37dd9a31bb66-kube-api-access-mhxdz\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.960939 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"] Jan 31 04:52:12 crc kubenswrapper[4812]: I0131 04:52:12.966119 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-5c49cbbfd-wfwms"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.255048 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.343814 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.343963 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"persistence\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344013 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: 
\"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344037 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344082 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6w8zm\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344105 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344123 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344471 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins" (OuterVolumeSpecName: "rabbitmq-plugins") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "rabbitmq-plugins". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344617 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie\") pod \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\" (UID: \"9ee32a7e-691a-4b75-b7ae-e32b64c41b36\") " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344740 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf" (OuterVolumeSpecName: "plugins-conf") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "plugins-conf". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344975 4812 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-plugins\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-plugins\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.344994 4812 reconciler_common.go:293] "Volume detached for volume \"plugins-conf\" (UniqueName: \"kubernetes.io/configmap/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-plugins-conf\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.345188 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie" (OuterVolumeSpecName: "rabbitmq-erlang-cookie") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "rabbitmq-erlang-cookie". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.347442 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret" (OuterVolumeSpecName: "erlang-cookie-secret") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "erlang-cookie-secret". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.349100 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info" (OuterVolumeSpecName: "pod-info") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "pod-info". PluginName "kubernetes.io/downward-api", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.350026 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm" (OuterVolumeSpecName: "kube-api-access-6w8zm") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "kube-api-access-6w8zm". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.354294 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80" (OuterVolumeSpecName: "persistence") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80". 
PluginName "kubernetes.io/csi", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.427474 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd" (OuterVolumeSpecName: "rabbitmq-confd") pod "9ee32a7e-691a-4b75-b7ae-e32b64c41b36" (UID: "9ee32a7e-691a-4b75-b7ae-e32b64c41b36"). InnerVolumeSpecName "rabbitmq-confd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446458 4812 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-confd\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-confd\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446498 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-6w8zm\" (UniqueName: \"kubernetes.io/projected/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-kube-api-access-6w8zm\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446509 4812 reconciler_common.go:293] "Volume detached for volume \"pod-info\" (UniqueName: \"kubernetes.io/downward-api/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-pod-info\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446520 4812 reconciler_common.go:293] "Volume detached for volume \"rabbitmq-erlang-cookie\" (UniqueName: \"kubernetes.io/empty-dir/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-rabbitmq-erlang-cookie\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446529 4812 reconciler_common.go:293] "Volume detached for volume \"erlang-cookie-secret\" (UniqueName: \"kubernetes.io/secret/9ee32a7e-691a-4b75-b7ae-e32b64c41b36-erlang-cookie-secret\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.446555 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started 
for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") on node \"crc\" " Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.459875 4812 csi_attacher.go:630] kubernetes.io/csi: attacher.UnmountDevice STAGE_UNSTAGE_VOLUME capability not set. Skipping UnmountDevice... Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.460088 4812 operation_generator.go:917] UnmountDevice succeeded for volume "pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80" (UniqueName: "kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80") on node "crc" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.548410 4812 reconciler_common.go:293] "Volume detached for volume \"pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\" (UniqueName: \"kubernetes.io/csi/kubevirt.io.hostpath-provisioner^pvc-2ed9e180-993a-4ca4-b213-670c4e5a9c80\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.619342 4812 generic.go:334] "Generic (PLEG): container finished" podID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerID="086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d" exitCode=0 Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.619436 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/rabbitmq-server-0" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.619446 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerDied","Data":"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d"} Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.619532 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/rabbitmq-server-0" event={"ID":"9ee32a7e-691a-4b75-b7ae-e32b64c41b36","Type":"ContainerDied","Data":"26c2192037c4aae70592c14ee82aef35af6e4bd5e129e7b3fc890b0bf7bf9f2a"} Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.619555 4812 scope.go:117] "RemoveContainer" containerID="086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.622869 4812 generic.go:334] "Generic (PLEG): container finished" podID="efd185c8-37f1-4661-b631-524671bff15f" containerID="5b262d6b7da31fc5a2b2f4365848ada534c8499fe6dece82db9aa3201448502c" exitCode=0 Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.622958 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerDied","Data":"5b262d6b7da31fc5a2b2f4365848ada534c8499fe6dece82db9aa3201448502c"} Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.629326 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/glance-operator-index-gqrtf" event={"ID":"a84ac0d3-4f66-44f0-8566-37dd9a31bb66","Type":"ContainerDied","Data":"7cebf7d9a0d92404c987af65ee7395672a289702e6dc15a351b27e757fc3bce1"} Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.629368 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/glance-operator-index-gqrtf" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.666647 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.667857 4812 scope.go:117] "RemoveContainer" containerID="26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.675494 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/glance-operator-index-gqrtf"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.692005 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.699079 4812 scope.go:117] "RemoveContainer" containerID="086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.699243 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/rabbitmq-server-0"] Jan 31 04:52:13 crc kubenswrapper[4812]: E0131 04:52:13.701425 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d\": container with ID starting with 086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d not found: ID does not exist" containerID="086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.701467 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d"} err="failed to get container status \"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d\": rpc error: code = NotFound desc = could not find container 
\"086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d\": container with ID starting with 086a2e1159ba219c764df2d49b69060b2a65a70ac249c44ee14ac2da0d01204d not found: ID does not exist" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.701496 4812 scope.go:117] "RemoveContainer" containerID="26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072" Jan 31 04:52:13 crc kubenswrapper[4812]: E0131 04:52:13.702189 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072\": container with ID starting with 26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072 not found: ID does not exist" containerID="26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.702219 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072"} err="failed to get container status \"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072\": rpc error: code = NotFound desc = could not find container \"26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072\": container with ID starting with 26990bb3e84d5bcb6b4be9a9ec6adb46cd28fa6e8b85423a26affc9539cfa072 not found: ID does not exist" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.702237 4812 scope.go:117] "RemoveContainer" containerID="7ab6ed6bbc496845b23c712bc1de59a49f4d382adcab786248a95bb9c417d985" Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.932741 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="glance-kuttl-tests/openstack-galera-0" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="galera" containerID="cri-o://5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae" gracePeriod=26 Jan 31 04:52:13 crc 
kubenswrapper[4812]: I0131 04:52:13.940584 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-db-create-6m4g7"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.945748 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-db-create-6m4g7"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.961998 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.969433 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8"] Jan 31 04:52:13 crc kubenswrapper[4812]: I0131 04:52:13.976181 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone-0ca8-account-create-update-dgwc8"] Jan 31 04:52:14 crc kubenswrapper[4812]: E0131 04:52:14.261763 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-scripts: configmap "openstack-scripts" not found Jan 31 04:52:14 crc kubenswrapper[4812]: E0131 04:52:14.261897 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts podName:4f0da6d0-0d0b-40b3-a3f7-90470b94115b nodeName:}" failed. No retries permitted until 2026-01-31 04:52:18.261870899 +0000 UTC m=+1546.756892614 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "operator-scripts" (UniqueName: "kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts") pod "keystone0ca8-account-delete-nmrnh" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b") : configmap "openstack-scripts" not found Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.277399 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.338404 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.338453 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.369276 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="014dcd69-d412-4c77-8a96-521ffc036f50" path="/var/lib/kubelet/pods/014dcd69-d412-4c77-8a96-521ffc036f50/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.371826 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8c6a9acd-6420-45da-9853-b06d5d4b053b" path="/var/lib/kubelet/pods/8c6a9acd-6420-45da-9853-b06d5d4b053b/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.373001 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="949a611c-00dc-4dac-9068-0dc00cf79572" path="/var/lib/kubelet/pods/949a611c-00dc-4dac-9068-0dc00cf79572/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.373720 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="95dc3255-b981-404f-8f25-641f12db9e86" path="/var/lib/kubelet/pods/95dc3255-b981-404f-8f25-641f12db9e86/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.374723 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" path="/var/lib/kubelet/pods/99392b95-2df1-4600-afd1-c6a4f4d47e5c/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.377906 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" path="/var/lib/kubelet/pods/9ee32a7e-691a-4b75-b7ae-e32b64c41b36/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.378519 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" path="/var/lib/kubelet/pods/a84ac0d3-4f66-44f0-8566-37dd9a31bb66/volumes" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.466776 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts\") pod \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.467003 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jg28q\" (UniqueName: \"kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q\") pod \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\" (UID: \"4f0da6d0-0d0b-40b3-a3f7-90470b94115b\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.467652 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "4f0da6d0-0d0b-40b3-a3f7-90470b94115b" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.481806 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q" (OuterVolumeSpecName: "kube-api-access-jg28q") pod "4f0da6d0-0d0b-40b3-a3f7-90470b94115b" (UID: "4f0da6d0-0d0b-40b3-a3f7-90470b94115b"). InnerVolumeSpecName "kube-api-access-jg28q". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.520378 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.569716 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.569750 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-jg28q\" (UniqueName: \"kubernetes.io/projected/4f0da6d0-0d0b-40b3-a3f7-90470b94115b-kube-api-access-jg28q\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.641774 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-1" event={"ID":"efd185c8-37f1-4661-b631-524671bff15f","Type":"ContainerDied","Data":"b56c1621e0a70d11272fb594cc08f91e48edacb4dafdf02da4f9adeefe297503"} Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.641865 4812 scope.go:117] "RemoveContainer" containerID="5b262d6b7da31fc5a2b2f4365848ada534c8499fe6dece82db9aa3201448502c" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.642036 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-1" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.652238 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" event={"ID":"4f0da6d0-0d0b-40b3-a3f7-90470b94115b","Type":"ContainerDied","Data":"1560edc753f5e8bf643d9b4d7ca612c740697ba191b460ae36062d04bf846de0"} Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.652351 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/keystone0ca8-account-delete-nmrnh" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.670883 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k6hlq\" (UniqueName: \"kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.670975 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.671073 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.671115 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: 
\"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.671163 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.671201 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated\") pod \"efd185c8-37f1-4661-b631-524671bff15f\" (UID: \"efd185c8-37f1-4661-b631-524671bff15f\") " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.671571 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.672541 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). InnerVolumeSpecName "operator-scripts". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.672600 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). 
InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.673467 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.677745 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq" (OuterVolumeSpecName: "kube-api-access-k6hlq") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). InnerVolumeSpecName "kube-api-access-k6hlq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.686325 4812 scope.go:117] "RemoveContainer" containerID="4bbd1c19288ea0a64cbcbbe90be9a0d57e6900bbfeba657dc1d439e2fccb2b8d" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.690180 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage17-crc" (OuterVolumeSpecName: "mysql-db") pod "efd185c8-37f1-4661-b631-524671bff15f" (UID: "efd185c8-37f1-4661-b631-524671bff15f"). InnerVolumeSpecName "local-storage17-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.694023 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.700718 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/keystone0ca8-account-delete-nmrnh"] Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.708870 4812 scope.go:117] "RemoveContainer" containerID="19a8ce328eab525a9058852edc6ae8c6ad438e3de4bcc13a5b77bfe48cb2934b" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.773150 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" " Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.773203 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/efd185c8-37f1-4661-b631-524671bff15f-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.773217 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-k6hlq\" (UniqueName: \"kubernetes.io/projected/efd185c8-37f1-4661-b631-524671bff15f-kube-api-access-k6hlq\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.773228 4812 reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.773239 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 
04:52:14.773251 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/efd185c8-37f1-4661-b631-524671bff15f-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.784777 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage17-crc" (UniqueName: "kubernetes.io/local-volume/local-storage17-crc") on node "crc" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.875176 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage17-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage17-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:14 crc kubenswrapper[4812]: I0131 04:52:14.998799 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.008781 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-1"] Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.571048 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.661550 4812 generic.go:334] "Generic (PLEG): container finished" podID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerID="5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae" exitCode=0 Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.661590 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerDied","Data":"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae"} Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.661611 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/openstack-galera-0" event={"ID":"7666fda0-373e-4936-bd6f-ea26691ad9d5","Type":"ContainerDied","Data":"4b50cdbdf9eea25def0c3b9e4c5dd09260e5bcf1a43a4b1045c1b3eac4ca831d"} Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.661627 4812 scope.go:117] "RemoveContainer" containerID="5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.661729 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/openstack-galera-0" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.679880 4812 scope.go:117] "RemoveContainer" containerID="719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687739 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687786 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687805 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwrbs\" (UniqueName: \"kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687860 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687902 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"mysql-db\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: 
\"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.687965 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") pod \"7666fda0-373e-4936-bd6f-ea26691ad9d5\" (UID: \"7666fda0-373e-4936-bd6f-ea26691ad9d5\") " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.688652 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config" (OuterVolumeSpecName: "kolla-config") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "kolla-config". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.688660 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default" (OuterVolumeSpecName: "config-data-default") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "config-data-default". PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.688819 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts" (OuterVolumeSpecName: "operator-scripts") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "operator-scripts". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.689034 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated" (OuterVolumeSpecName: "config-data-generated") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "config-data-generated". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.694035 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs" (OuterVolumeSpecName: "kube-api-access-dwrbs") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "kube-api-access-dwrbs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.698501 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage11-crc" (OuterVolumeSpecName: "mysql-db") pod "7666fda0-373e-4936-bd6f-ea26691ad9d5" (UID: "7666fda0-373e-4936-bd6f-ea26691ad9d5"). InnerVolumeSpecName "local-storage11-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.698531 4812 scope.go:117] "RemoveContainer" containerID="5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae" Jan 31 04:52:15 crc kubenswrapper[4812]: E0131 04:52:15.698903 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae\": container with ID starting with 5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae not found: ID does not exist" containerID="5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.698960 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae"} err="failed to get container status \"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae\": rpc error: code = NotFound desc = could not find container \"5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae\": container with ID starting with 5fbff18c7f0605fcba04e7f0ac44bcd9aab02dab9fe89e996e87d191413cf0ae not found: ID does not exist" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.698989 4812 scope.go:117] "RemoveContainer" containerID="719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de" Jan 31 04:52:15 crc kubenswrapper[4812]: E0131 04:52:15.699284 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de\": container with ID starting with 719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de not found: ID does not exist" containerID="719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 
04:52:15.699321 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de"} err="failed to get container status \"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de\": rpc error: code = NotFound desc = could not find container \"719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de\": container with ID starting with 719ab4d97b7b492558ee4d02b2881cc3bbe63def0c559c8e351b4516376b25de not found: ID does not exist" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789526 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" " Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789858 4812 reconciler_common.go:293] "Volume detached for volume \"operator-scripts\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-operator-scripts\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789873 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-generated\" (UniqueName: \"kubernetes.io/empty-dir/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-generated\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789881 4812 reconciler_common.go:293] "Volume detached for volume \"config-data-default\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-config-data-default\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789891 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dwrbs\" (UniqueName: \"kubernetes.io/projected/7666fda0-373e-4936-bd6f-ea26691ad9d5-kube-api-access-dwrbs\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.789899 4812 
reconciler_common.go:293] "Volume detached for volume \"kolla-config\" (UniqueName: \"kubernetes.io/configmap/7666fda0-373e-4936-bd6f-ea26691ad9d5-kolla-config\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.812266 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage11-crc" (UniqueName: "kubernetes.io/local-volume/local-storage11-crc") on node "crc" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.892096 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage11-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage11-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:15 crc kubenswrapper[4812]: I0131 04:52:15.998467 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:52:16 crc kubenswrapper[4812]: I0131 04:52:16.006235 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/openstack-galera-0"] Jan 31 04:52:16 crc kubenswrapper[4812]: I0131 04:52:16.348760 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" path="/var/lib/kubelet/pods/4f0da6d0-0d0b-40b3-a3f7-90470b94115b/volumes" Jan 31 04:52:16 crc kubenswrapper[4812]: I0131 04:52:16.349529 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" path="/var/lib/kubelet/pods/7666fda0-373e-4936-bd6f-ea26691ad9d5/volumes" Jan 31 04:52:16 crc kubenswrapper[4812]: I0131 04:52:16.350400 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="efd185c8-37f1-4661-b631-524671bff15f" path="/var/lib/kubelet/pods/efd185c8-37f1-4661-b631-524671bff15f/volumes" Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.349128 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 
04:52:17.349660 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" podUID="0921e685-9db0-446d-9ed9-9ac2016fffc2" containerName="manager" containerID="cri-o://b9eca0ee3a1c8d0d0edf1334d72b1aa44595672978076e3111bc0136618e21d0" gracePeriod=10 Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.613354 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.613743 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/swift-operator-index-ffcjk" podUID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" containerName="registry-server" containerID="cri-o://394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470" gracePeriod=30 Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.674949 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"] Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.684063 4812 generic.go:334] "Generic (PLEG): container finished" podID="0921e685-9db0-446d-9ed9-9ac2016fffc2" containerID="b9eca0ee3a1c8d0d0edf1334d72b1aa44595672978076e3111bc0136618e21d0" exitCode=0 Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.684102 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" event={"ID":"0921e685-9db0-446d-9ed9-9ac2016fffc2","Type":"ContainerDied","Data":"b9eca0ee3a1c8d0d0edf1334d72b1aa44595672978076e3111bc0136618e21d0"} Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.685355 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/70e8c782c05b28200f5f2de3cb5cb1e7b36c65af2b76ab17506213a5b47v5kw"] Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.806739 4812 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.919767 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert\") pod \"0921e685-9db0-446d-9ed9-9ac2016fffc2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.919896 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert\") pod \"0921e685-9db0-446d-9ed9-9ac2016fffc2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.919919 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w86fs\" (UniqueName: \"kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs\") pod \"0921e685-9db0-446d-9ed9-9ac2016fffc2\" (UID: \"0921e685-9db0-446d-9ed9-9ac2016fffc2\") " Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.926213 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs" (OuterVolumeSpecName: "kube-api-access-w86fs") pod "0921e685-9db0-446d-9ed9-9ac2016fffc2" (UID: "0921e685-9db0-446d-9ed9-9ac2016fffc2"). InnerVolumeSpecName "kube-api-access-w86fs". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.928016 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "0921e685-9db0-446d-9ed9-9ac2016fffc2" (UID: "0921e685-9db0-446d-9ed9-9ac2016fffc2"). 
InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:17 crc kubenswrapper[4812]: I0131 04:52:17.929561 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "0921e685-9db0-446d-9ed9-9ac2016fffc2" (UID: "0921e685-9db0-446d-9ed9-9ac2016fffc2"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.021629 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.021656 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/0921e685-9db0-446d-9ed9-9ac2016fffc2-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.021666 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w86fs\" (UniqueName: \"kubernetes.io/projected/0921e685-9db0-446d-9ed9-9ac2016fffc2-kube-api-access-w86fs\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.027607 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.122826 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w9fvb\" (UniqueName: \"kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb\") pod \"ea90fec5-c635-4a58-9cc4-ff55147e2c26\" (UID: \"ea90fec5-c635-4a58-9cc4-ff55147e2c26\") " Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.126516 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb" (OuterVolumeSpecName: "kube-api-access-w9fvb") pod "ea90fec5-c635-4a58-9cc4-ff55147e2c26" (UID: "ea90fec5-c635-4a58-9cc4-ff55147e2c26"). InnerVolumeSpecName "kube-api-access-w9fvb". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.224407 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-w9fvb\" (UniqueName: \"kubernetes.io/projected/ea90fec5-c635-4a58-9cc4-ff55147e2c26-kube-api-access-w9fvb\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.354973 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e2a060a9-1558-46fd-b8eb-8b72af226b1f" path="/var/lib/kubelet/pods/e2a060a9-1558-46fd-b8eb-8b72af226b1f/volumes" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.692800 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" event={"ID":"0921e685-9db0-446d-9ed9-9ac2016fffc2","Type":"ContainerDied","Data":"1185970a060e074ec2015ba16c197498290f2e107ea4242d39f833a2c0abea3d"} Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.692902 4812 scope.go:117] "RemoveContainer" containerID="b9eca0ee3a1c8d0d0edf1334d72b1aa44595672978076e3111bc0136618e21d0" Jan 31 04:52:18 crc 
kubenswrapper[4812]: I0131 04:52:18.692911 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.695035 4812 generic.go:334] "Generic (PLEG): container finished" podID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" containerID="394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470" exitCode=0 Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.695068 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/swift-operator-index-ffcjk" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.695092 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-ffcjk" event={"ID":"ea90fec5-c635-4a58-9cc4-ff55147e2c26","Type":"ContainerDied","Data":"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470"} Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.695128 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/swift-operator-index-ffcjk" event={"ID":"ea90fec5-c635-4a58-9cc4-ff55147e2c26","Type":"ContainerDied","Data":"b9a2cc0204aa5f0aa80ab71b647a91d0491e9777208a957fc0dbde58a431eed2"} Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.721166 4812 scope.go:117] "RemoveContainer" containerID="394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.730787 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.740278 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-index-ffcjk"] Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.743404 4812 scope.go:117] "RemoveContainer" containerID="394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470" 
Jan 31 04:52:18 crc kubenswrapper[4812]: E0131 04:52:18.744011 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470\": container with ID starting with 394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470 not found: ID does not exist" containerID="394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.744060 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470"} err="failed to get container status \"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470\": rpc error: code = NotFound desc = could not find container \"394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470\": container with ID starting with 394b23e4ec5efdd0666bcb4908af86327b17937fc827c069a097696bae091470 not found: ID does not exist" Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.747653 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:52:18 crc kubenswrapper[4812]: I0131 04:52:18.755134 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/swift-operator-controller-manager-6d8cb97c5-8h5w2"] Jan 31 04:52:20 crc kubenswrapper[4812]: I0131 04:52:20.350337 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0921e685-9db0-446d-9ed9-9ac2016fffc2" path="/var/lib/kubelet/pods/0921e685-9db0-446d-9ed9-9ac2016fffc2/volumes" Jan 31 04:52:20 crc kubenswrapper[4812]: I0131 04:52:20.351704 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" path="/var/lib/kubelet/pods/ea90fec5-c635-4a58-9cc4-ff55147e2c26/volumes" Jan 31 04:52:20 crc kubenswrapper[4812]: I0131 
04:52:20.897897 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:52:20 crc kubenswrapper[4812]: I0131 04:52:20.898125 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" podUID="bd8a58fc-0e84-4670-998e-a615bb248ff4" containerName="manager" containerID="cri-o://0c99ce29918cee64610786b51cfe485eb830675b8765f10622a13f85b783e46d" gracePeriod=10 Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.131036 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.131600 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/keystone-operator-index-kgblt" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerName="registry-server" containerID="cri-o://305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" gracePeriod=30 Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.169009 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5"] Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.173105 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/b43f19b8e3bb8997a527070b172ae030accff9cd1a2f2b076f58d9c4efvvgt5"] Jan 31 04:52:21 crc kubenswrapper[4812]: E0131 04:52:21.704110 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d is running failed: container process not found" containerID="305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:21 crc kubenswrapper[4812]: 
E0131 04:52:21.704734 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d is running failed: container process not found" containerID="305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:21 crc kubenswrapper[4812]: E0131 04:52:21.705751 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d is running failed: container process not found" containerID="305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:21 crc kubenswrapper[4812]: E0131 04:52:21.706110 4812 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of 305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d is running failed: container process not found" probeType="Readiness" pod="openstack-operators/keystone-operator-index-kgblt" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerName="registry-server" Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.728799 4812 generic.go:334] "Generic (PLEG): container finished" podID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerID="305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" exitCode=0 Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.728882 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-kgblt" event={"ID":"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b","Type":"ContainerDied","Data":"305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d"} Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.730430 4812 generic.go:334] "Generic 
(PLEG): container finished" podID="bd8a58fc-0e84-4670-998e-a615bb248ff4" containerID="0c99ce29918cee64610786b51cfe485eb830675b8765f10622a13f85b783e46d" exitCode=0 Jan 31 04:52:21 crc kubenswrapper[4812]: I0131 04:52:21.730457 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" event={"ID":"bd8a58fc-0e84-4670-998e-a615bb248ff4","Type":"ContainerDied","Data":"0c99ce29918cee64610786b51cfe485eb830675b8765f10622a13f85b783e46d"} Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.098355 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.178924 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.284610 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xfbzt\" (UniqueName: \"kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt\") pod \"bd8a58fc-0e84-4670-998e-a615bb248ff4\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.284736 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert\") pod \"bd8a58fc-0e84-4670-998e-a615bb248ff4\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") " Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.284808 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert\") pod \"bd8a58fc-0e84-4670-998e-a615bb248ff4\" (UID: \"bd8a58fc-0e84-4670-998e-a615bb248ff4\") 
" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.284876 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v7n8p\" (UniqueName: \"kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p\") pod \"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b\" (UID: \"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b\") " Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.290092 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt" (OuterVolumeSpecName: "kube-api-access-xfbzt") pod "bd8a58fc-0e84-4670-998e-a615bb248ff4" (UID: "bd8a58fc-0e84-4670-998e-a615bb248ff4"). InnerVolumeSpecName "kube-api-access-xfbzt". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.290227 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "bd8a58fc-0e84-4670-998e-a615bb248ff4" (UID: "bd8a58fc-0e84-4670-998e-a615bb248ff4"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.290386 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "bd8a58fc-0e84-4670-998e-a615bb248ff4" (UID: "bd8a58fc-0e84-4670-998e-a615bb248ff4"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.291134 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p" (OuterVolumeSpecName: "kube-api-access-v7n8p") pod "928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" (UID: "928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b"). InnerVolumeSpecName "kube-api-access-v7n8p". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.349656 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ac3f10a5-54f7-47a3-b6b4-412f5eae07f8" path="/var/lib/kubelet/pods/ac3f10a5-54f7-47a3-b6b4-412f5eae07f8/volumes" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.385962 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.386003 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/bd8a58fc-0e84-4670-998e-a615bb248ff4-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.386017 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v7n8p\" (UniqueName: \"kubernetes.io/projected/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b-kube-api-access-v7n8p\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.386031 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-xfbzt\" (UniqueName: \"kubernetes.io/projected/bd8a58fc-0e84-4670-998e-a615bb248ff4-kube-api-access-xfbzt\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.739969 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-index-kgblt" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.739958 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-index-kgblt" event={"ID":"928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b","Type":"ContainerDied","Data":"d130535c39f9b4bbed8fd34e4d59abdacf62747877ebc07862e17a886243ace1"} Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.740195 4812 scope.go:117] "RemoveContainer" containerID="305c921d15b24de325652e61af41748fe3449cebbc1fce3bd646266e96db5c7d" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.742436 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" event={"ID":"bd8a58fc-0e84-4670-998e-a615bb248ff4","Type":"ContainerDied","Data":"477d4ca55539926b6b212b61f5c6cd591d452349235bae6cb164474a21aabc3a"} Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.742580 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.767017 4812 scope.go:117] "RemoveContainer" containerID="0c99ce29918cee64610786b51cfe485eb830675b8765f10622a13f85b783e46d" Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.771329 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.791983 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-index-kgblt"] Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.808796 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:52:22 crc kubenswrapper[4812]: I0131 04:52:22.818251 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/keystone-operator-controller-manager-c77b796c8-kvlqm"] Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.235873 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.236410 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" podUID="5f08c58b-776d-4693-a282-e192ecc83bc2" containerName="operator" containerID="cri-o://4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab" gracePeriod=10 Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.351886 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" path="/var/lib/kubelet/pods/928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b/volumes" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.352479 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="bd8a58fc-0e84-4670-998e-a615bb248ff4" path="/var/lib/kubelet/pods/bd8a58fc-0e84-4670-998e-a615bb248ff4/volumes" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.447130 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.447423 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" podUID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" containerName="registry-server" containerID="cri-o://1142f1825e9bea38609ccde8f470500672c0dc3ca2026322ae9e5a66d9fe4fdd" gracePeriod=30 Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.467808 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v"] Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.476087 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/9704761d240e56fb98655ffd81084895b33a73ec711f4dcdef0450e5908hj9v"] Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.743225 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.769268 4812 generic.go:334] "Generic (PLEG): container finished" podID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" containerID="1142f1825e9bea38609ccde8f470500672c0dc3ca2026322ae9e5a66d9fe4fdd" exitCode=0 Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.769332 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" event={"ID":"d74b15ea-c24b-479e-9469-ee16f0cc85f0","Type":"ContainerDied","Data":"1142f1825e9bea38609ccde8f470500672c0dc3ca2026322ae9e5a66d9fe4fdd"} Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.772084 4812 generic.go:334] "Generic (PLEG): container finished" podID="5f08c58b-776d-4693-a282-e192ecc83bc2" containerID="4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab" exitCode=0 Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.772109 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" event={"ID":"5f08c58b-776d-4693-a282-e192ecc83bc2","Type":"ContainerDied","Data":"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab"} Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.772125 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" event={"ID":"5f08c58b-776d-4693-a282-e192ecc83bc2","Type":"ContainerDied","Data":"b70424e23323900a46dc8c57e4ef9e52a85ac47d08b84b4ec9c317010c8c4a9e"} Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.772141 4812 scope.go:117] "RemoveContainer" containerID="4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.772214 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.793623 4812 scope.go:117] "RemoveContainer" containerID="4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab" Jan 31 04:52:24 crc kubenswrapper[4812]: E0131 04:52:24.794040 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab\": container with ID starting with 4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab not found: ID does not exist" containerID="4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.794074 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab"} err="failed to get container status \"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab\": rpc error: code = NotFound desc = could not find container \"4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab\": container with ID starting with 4a514b5f73020ef907baf9e06f5843b74a347163e92c890e171ef860b9be02ab not found: ID does not exist" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.878262 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.919908 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gndzh\" (UniqueName: \"kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh\") pod \"5f08c58b-776d-4693-a282-e192ecc83bc2\" (UID: \"5f08c58b-776d-4693-a282-e192ecc83bc2\") " Jan 31 04:52:24 crc kubenswrapper[4812]: I0131 04:52:24.925283 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh" (OuterVolumeSpecName: "kube-api-access-gndzh") pod "5f08c58b-776d-4693-a282-e192ecc83bc2" (UID: "5f08c58b-776d-4693-a282-e192ecc83bc2"). InnerVolumeSpecName "kube-api-access-gndzh". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.021117 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bgkcw\" (UniqueName: \"kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw\") pod \"d74b15ea-c24b-479e-9469-ee16f0cc85f0\" (UID: \"d74b15ea-c24b-479e-9469-ee16f0cc85f0\") " Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.021771 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gndzh\" (UniqueName: \"kubernetes.io/projected/5f08c58b-776d-4693-a282-e192ecc83bc2-kube-api-access-gndzh\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.024321 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw" (OuterVolumeSpecName: "kube-api-access-bgkcw") pod "d74b15ea-c24b-479e-9469-ee16f0cc85f0" (UID: "d74b15ea-c24b-479e-9469-ee16f0cc85f0"). InnerVolumeSpecName "kube-api-access-bgkcw". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.102064 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.107582 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-779fc9694b-7jhbc"] Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.124341 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bgkcw\" (UniqueName: \"kubernetes.io/projected/d74b15ea-c24b-479e-9469-ee16f0cc85f0-kube-api-access-bgkcw\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.780862 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" event={"ID":"d74b15ea-c24b-479e-9469-ee16f0cc85f0","Type":"ContainerDied","Data":"0751313fe43ac721041bc470e16461bf007216cc0354b53545fae3718f89cbfd"} Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.780914 4812 scope.go:117] "RemoveContainer" containerID="1142f1825e9bea38609ccde8f470500672c0dc3ca2026322ae9e5a66d9fe4fdd" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.781024 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/rabbitmq-cluster-operator-index-tg59z" Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.811234 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:52:25 crc kubenswrapper[4812]: I0131 04:52:25.818743 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/rabbitmq-cluster-operator-index-tg59z"] Jan 31 04:52:26 crc kubenswrapper[4812]: I0131 04:52:26.346500 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5f08c58b-776d-4693-a282-e192ecc83bc2" path="/var/lib/kubelet/pods/5f08c58b-776d-4693-a282-e192ecc83bc2/volumes" Jan 31 04:52:26 crc kubenswrapper[4812]: I0131 04:52:26.347136 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ee7c2e6-bdce-4bd4-80f9-7745b0072f56" path="/var/lib/kubelet/pods/8ee7c2e6-bdce-4bd4-80f9-7745b0072f56/volumes" Jan 31 04:52:26 crc kubenswrapper[4812]: I0131 04:52:26.347828 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" path="/var/lib/kubelet/pods/d74b15ea-c24b-479e-9469-ee16f0cc85f0/volumes" Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.395470 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.396136 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerName="manager" containerID="cri-o://1aa87d438681fa7652fb4829d433806c2725d1cc2cf3fc164d33d5065858be90" gracePeriod=10 Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.638881 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:52:32 crc 
kubenswrapper[4812]: I0131 04:52:32.639102 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/infra-operator-index-f6vmn" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" containerName="registry-server" containerID="cri-o://cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" gracePeriod=30 Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.670272 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977"] Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.675552 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/d7c3b59ed6c2e571e21460d743e5fcd0c5f76cb7c446e474a3d05f7576nr977"] Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.852381 4812 generic.go:334] "Generic (PLEG): container finished" podID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerID="1aa87d438681fa7652fb4829d433806c2725d1cc2cf3fc164d33d5065858be90" exitCode=0 Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.852474 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" event={"ID":"d72ee50b-10a1-4e23-b0eb-ec5227c4f740","Type":"ContainerDied","Data":"1aa87d438681fa7652fb4829d433806c2725d1cc2cf3fc164d33d5065858be90"} Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.854501 4812 generic.go:334] "Generic (PLEG): container finished" podID="0771cbbe-eeee-435e-8740-edab15c2484c" containerID="cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" exitCode=0 Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.854539 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-f6vmn" event={"ID":"0771cbbe-eeee-435e-8740-edab15c2484c","Type":"ContainerDied","Data":"cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005"} Jan 31 04:52:32 crc kubenswrapper[4812]: E0131 
04:52:32.893807 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005 is running failed: container process not found" containerID="cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:32 crc kubenswrapper[4812]: E0131 04:52:32.894209 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005 is running failed: container process not found" containerID="cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:32 crc kubenswrapper[4812]: E0131 04:52:32.894514 4812 log.go:32] "ExecSync cmd from runtime service failed" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005 is running failed: container process not found" containerID="cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" cmd=["grpc_health_probe","-addr=:50051"] Jan 31 04:52:32 crc kubenswrapper[4812]: E0131 04:52:32.894551 4812 prober.go:104] "Probe errored" err="rpc error: code = NotFound desc = container is not created or running: checking if PID of cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005 is running failed: container process not found" probeType="Readiness" pod="openstack-operators/infra-operator-index-f6vmn" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" containerName="registry-server" Jan 31 04:52:32 crc kubenswrapper[4812]: I0131 04:52:32.914489 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.028387 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dlh54\" (UniqueName: \"kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54\") pod \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.028508 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert\") pod \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.028590 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert\") pod \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\" (UID: \"d72ee50b-10a1-4e23-b0eb-ec5227c4f740\") " Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.036230 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "d72ee50b-10a1-4e23-b0eb-ec5227c4f740" (UID: "d72ee50b-10a1-4e23-b0eb-ec5227c4f740"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.050473 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54" (OuterVolumeSpecName: "kube-api-access-dlh54") pod "d72ee50b-10a1-4e23-b0eb-ec5227c4f740" (UID: "d72ee50b-10a1-4e23-b0eb-ec5227c4f740"). 
InnerVolumeSpecName "kube-api-access-dlh54". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.051279 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "d72ee50b-10a1-4e23-b0eb-ec5227c4f740" (UID: "d72ee50b-10a1-4e23-b0eb-ec5227c4f740"). InnerVolumeSpecName "webhook-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.086430 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.129786 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-dlh54\" (UniqueName: \"kubernetes.io/projected/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-kube-api-access-dlh54\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.129831 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.129860 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/d72ee50b-10a1-4e23-b0eb-ec5227c4f740-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.230940 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qgddp\" (UniqueName: \"kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp\") pod \"0771cbbe-eeee-435e-8740-edab15c2484c\" (UID: \"0771cbbe-eeee-435e-8740-edab15c2484c\") " Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.233616 4812 
operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp" (OuterVolumeSpecName: "kube-api-access-qgddp") pod "0771cbbe-eeee-435e-8740-edab15c2484c" (UID: "0771cbbe-eeee-435e-8740-edab15c2484c"). InnerVolumeSpecName "kube-api-access-qgddp". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.332690 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-qgddp\" (UniqueName: \"kubernetes.io/projected/0771cbbe-eeee-435e-8740-edab15c2484c-kube-api-access-qgddp\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.861895 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-index-f6vmn" event={"ID":"0771cbbe-eeee-435e-8740-edab15c2484c","Type":"ContainerDied","Data":"b505813ddc93efbe6cf41064deef7bb997932980b152cf0847e816e89732c279"} Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.862402 4812 scope.go:117] "RemoveContainer" containerID="cfbc2f134e7ad8ee8608b6d048710c52a3841b1406d40fc1d2b57ff8fe594005" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.862414 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openstack-operators/infra-operator-index-f6vmn" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.863910 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" event={"ID":"d72ee50b-10a1-4e23-b0eb-ec5227c4f740","Type":"ContainerDied","Data":"712ebe9f5aab441d766abb60305028907417676d719d86b026bc227c9c13f1a0"} Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.863945 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.864285 4812 prober.go:107] "Probe failed" probeType="Readiness" pod="openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerName="manager" probeResult="failure" output="Get \"http://10.217.0.50:8081/readyz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.883071 4812 scope.go:117] "RemoveContainer" containerID="1aa87d438681fa7652fb4829d433806c2725d1cc2cf3fc164d33d5065858be90" Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.907737 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.914334 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-index-f6vmn"] Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.918002 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:52:33 crc kubenswrapper[4812]: I0131 04:52:33.921315 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/infra-operator-controller-manager-748684b5b6-cr6xv"] Jan 31 04:52:34 crc kubenswrapper[4812]: I0131 04:52:34.350080 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" path="/var/lib/kubelet/pods/0771cbbe-eeee-435e-8740-edab15c2484c/volumes" Jan 31 04:52:34 crc kubenswrapper[4812]: I0131 04:52:34.350738 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7b0f44c0-ce3b-44a9-b345-4143577af2f2" path="/var/lib/kubelet/pods/7b0f44c0-ce3b-44a9-b345-4143577af2f2/volumes" Jan 31 04:52:34 crc kubenswrapper[4812]: I0131 04:52:34.351496 4812 
kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" path="/var/lib/kubelet/pods/d72ee50b-10a1-4e23-b0eb-ec5227c4f740/volumes" Jan 31 04:52:34 crc kubenswrapper[4812]: I0131 04:52:34.912765 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:52:34 crc kubenswrapper[4812]: I0131 04:52:34.913085 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" podUID="c2d7a051-0941-42a6-82dc-76cfa73c185d" containerName="manager" containerID="cri-o://0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4" gracePeriod=10 Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.151501 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.151721 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openstack-operators/mariadb-operator-index-788ws" podUID="65cc13cf-2366-4670-ac64-27e8ffa38afc" containerName="registry-server" containerID="cri-o://2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d" gracePeriod=30 Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.180077 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t"] Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.184427 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/f5f7435db1a968bc2e4b919cf4f5a8f6719d9ac995e6b095f5b2e84f408zk7t"] Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.350926 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.457787 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert\") pod \"c2d7a051-0941-42a6-82dc-76cfa73c185d\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.457882 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7s9jw\" (UniqueName: \"kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw\") pod \"c2d7a051-0941-42a6-82dc-76cfa73c185d\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.457920 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert\") pod \"c2d7a051-0941-42a6-82dc-76cfa73c185d\" (UID: \"c2d7a051-0941-42a6-82dc-76cfa73c185d\") " Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.464579 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert" (OuterVolumeSpecName: "apiservice-cert") pod "c2d7a051-0941-42a6-82dc-76cfa73c185d" (UID: "c2d7a051-0941-42a6-82dc-76cfa73c185d"). InnerVolumeSpecName "apiservice-cert". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.466657 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert" (OuterVolumeSpecName: "webhook-cert") pod "c2d7a051-0941-42a6-82dc-76cfa73c185d" (UID: "c2d7a051-0941-42a6-82dc-76cfa73c185d"). InnerVolumeSpecName "webhook-cert". 
PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.469682 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw" (OuterVolumeSpecName: "kube-api-access-7s9jw") pod "c2d7a051-0941-42a6-82dc-76cfa73c185d" (UID: "c2d7a051-0941-42a6-82dc-76cfa73c185d"). InnerVolumeSpecName "kube-api-access-7s9jw". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.559645 4812 reconciler_common.go:293] "Volume detached for volume \"apiservice-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-apiservice-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.559678 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-7s9jw\" (UniqueName: \"kubernetes.io/projected/c2d7a051-0941-42a6-82dc-76cfa73c185d-kube-api-access-7s9jw\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.559690 4812 reconciler_common.go:293] "Volume detached for volume \"webhook-cert\" (UniqueName: \"kubernetes.io/secret/c2d7a051-0941-42a6-82dc-76cfa73c185d-webhook-cert\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.568528 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.762171 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l4sjd\" (UniqueName: \"kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd\") pod \"65cc13cf-2366-4670-ac64-27e8ffa38afc\" (UID: \"65cc13cf-2366-4670-ac64-27e8ffa38afc\") " Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.765199 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd" (OuterVolumeSpecName: "kube-api-access-l4sjd") pod "65cc13cf-2366-4670-ac64-27e8ffa38afc" (UID: "65cc13cf-2366-4670-ac64-27e8ffa38afc"). InnerVolumeSpecName "kube-api-access-l4sjd". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.863235 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-l4sjd\" (UniqueName: \"kubernetes.io/projected/65cc13cf-2366-4670-ac64-27e8ffa38afc-kube-api-access-l4sjd\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.907644 4812 generic.go:334] "Generic (PLEG): container finished" podID="c2d7a051-0941-42a6-82dc-76cfa73c185d" containerID="0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4" exitCode=0 Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.907707 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.907713 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" event={"ID":"c2d7a051-0941-42a6-82dc-76cfa73c185d","Type":"ContainerDied","Data":"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4"} Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.907822 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4" event={"ID":"c2d7a051-0941-42a6-82dc-76cfa73c185d","Type":"ContainerDied","Data":"0218d70c63fa48fdfa752970a4cbdc30d48b28f6d1bebbfdec8db8e3db950e40"} Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.907857 4812 scope.go:117] "RemoveContainer" containerID="0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.909264 4812 generic.go:334] "Generic (PLEG): container finished" podID="65cc13cf-2366-4670-ac64-27e8ffa38afc" containerID="2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d" exitCode=0 Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.909282 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-788ws" event={"ID":"65cc13cf-2366-4670-ac64-27e8ffa38afc","Type":"ContainerDied","Data":"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d"} Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.909299 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openstack-operators/mariadb-operator-index-788ws" event={"ID":"65cc13cf-2366-4670-ac64-27e8ffa38afc","Type":"ContainerDied","Data":"cbda4954c4e0132714bc4c7decdd804416e8cf10437cfa25e87402b6d9237f2a"} Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.909324 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openstack-operators/mariadb-operator-index-788ws" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.929374 4812 scope.go:117] "RemoveContainer" containerID="0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4" Jan 31 04:52:35 crc kubenswrapper[4812]: E0131 04:52:35.930755 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4\": container with ID starting with 0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4 not found: ID does not exist" containerID="0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.930826 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4"} err="failed to get container status \"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4\": rpc error: code = NotFound desc = could not find container \"0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4\": container with ID starting with 0871b38156f4517afa956d95956df1ef948768be79d8a5723a50fc67d199aed4 not found: ID does not exist" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.930927 4812 scope.go:117] "RemoveContainer" containerID="2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.940089 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.947971 4812 scope.go:117] "RemoveContainer" containerID="2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.948106 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" 
pods=["openstack-operators/mariadb-operator-controller-manager-75bc68fcbf-78zf4"] Jan 31 04:52:35 crc kubenswrapper[4812]: E0131 04:52:35.948682 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d\": container with ID starting with 2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d not found: ID does not exist" containerID="2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.948721 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d"} err="failed to get container status \"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d\": rpc error: code = NotFound desc = could not find container \"2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d\": container with ID starting with 2b55e0bbd01d003a31c788e69f1a3b884d926e2e2dc1e83c9186ec048e50531d not found: ID does not exist" Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.954774 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:52:35 crc kubenswrapper[4812]: I0131 04:52:35.958190 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openstack-operators/mariadb-operator-index-788ws"] Jan 31 04:52:36 crc kubenswrapper[4812]: I0131 04:52:36.349444 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="118ced32-01e9-43b6-b6d4-8588fd59d6e7" path="/var/lib/kubelet/pods/118ced32-01e9-43b6-b6d4-8588fd59d6e7/volumes" Jan 31 04:52:36 crc kubenswrapper[4812]: I0131 04:52:36.351568 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="65cc13cf-2366-4670-ac64-27e8ffa38afc" path="/var/lib/kubelet/pods/65cc13cf-2366-4670-ac64-27e8ffa38afc/volumes" Jan 31 
04:52:36 crc kubenswrapper[4812]: I0131 04:52:36.352765 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2d7a051-0941-42a6-82dc-76cfa73c185d" path="/var/lib/kubelet/pods/c2d7a051-0941-42a6-82dc-76cfa73c185d/volumes" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.862531 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.934815 4812 generic.go:334] "Generic (PLEG): container finished" podID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerID="54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c" exitCode=137 Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.934881 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c"} Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.934921 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="glance-kuttl-tests/swift-storage-0" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.934942 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="glance-kuttl-tests/swift-storage-0" event={"ID":"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e","Type":"ContainerDied","Data":"5db0c7dca63414ed0916fd410bc4bb5f962aa194724d39e7b495d953d7230c31"} Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.934996 4812 scope.go:117] "RemoveContainer" containerID="54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.962409 4812 scope.go:117] "RemoveContainer" containerID="9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.980639 4812 scope.go:117] "RemoveContainer" containerID="055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.994134 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache\") pod \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.994213 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"swift\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") pod \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.994271 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wbxck\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck\") pod \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.994323 4812 
reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") pod \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.994367 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"lock\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock\") pod \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\" (UID: \"bab986d4-81ba-4f72-a3fc-3b0cdb004c6e\") " Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.995223 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock" (OuterVolumeSpecName: "lock") pod "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e"). InnerVolumeSpecName "lock". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:37 crc kubenswrapper[4812]: I0131 04:52:37.995579 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache" (OuterVolumeSpecName: "cache") pod "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e"). InnerVolumeSpecName "cache". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.001572 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck" (OuterVolumeSpecName: "kube-api-access-wbxck") pod "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e"). InnerVolumeSpecName "kube-api-access-wbxck". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005047 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift" (OuterVolumeSpecName: "etc-swift") pod "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e"). InnerVolumeSpecName "etc-swift". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005151 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005290 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/local-volume/local-storage15-crc" (OuterVolumeSpecName: "swift") pod "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" (UID: "bab986d4-81ba-4f72-a3fc-3b0cdb004c6e"). InnerVolumeSpecName "local-storage15-crc". 
PluginName "kubernetes.io/local-volume", VolumeGidValue "" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005468 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-reaper" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005506 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-reaper" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005521 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="65cc13cf-2366-4670-ac64-27e8ffa38afc" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005526 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="65cc13cf-2366-4670-ac64-27e8ffa38afc" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005537 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005543 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005550 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005556 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005587 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005594 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" 
containerName="object-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005602 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005609 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005621 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005629 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005669 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005682 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005693 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005701 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005710 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005716 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="galera" 
Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005753 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-expirer" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005761 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-expirer" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005773 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="swift-recon-cron" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005778 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="swift-recon-cron" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005789 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5f08c58b-776d-4693-a282-e192ecc83bc2" containerName="operator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005794 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5f08c58b-776d-4693-a282-e192ecc83bc2" containerName="operator" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005804 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005852 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005863 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005871 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" containerName="registry-server" Jan 31 04:52:38 crc 
kubenswrapper[4812]: E0131 04:52:38.005877 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005884 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="mysql-bootstrap" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005923 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bd8a58fc-0e84-4670-998e-a615bb248ff4" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005929 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bd8a58fc-0e84-4670-998e-a615bb248ff4" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005936 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005942 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005950 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005955 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.005964 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="setup-container" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.005969 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="setup-container" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006007 4812 
cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006014 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006021 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006026 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006033 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006039 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006049 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="c2d7a051-0941-42a6-82dc-76cfa73c185d" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006055 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2d7a051-0941-42a6-82dc-76cfa73c185d" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006088 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006094 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006105 4812 cpu_manager.go:410] "RemoveStaleState: 
removing container" podUID="949a611c-00dc-4dac-9068-0dc00cf79572" containerName="keystone-api" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006111 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="949a611c-00dc-4dac-9068-0dc00cf79572" containerName="keystone-api" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006116 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="014dcd69-d412-4c77-8a96-521ffc036f50" containerName="memcached" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006121 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="014dcd69-d412-4c77-8a96-521ffc036f50" containerName="memcached" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006130 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006136 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006167 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="rsync" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006173 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="rsync" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006182 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006188 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006196 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006202 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006210 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006216 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006248 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006254 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006265 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="rabbitmq" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006270 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="rabbitmq" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006278 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0921e685-9db0-446d-9ed9-9ac2016fffc2" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006284 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0921e685-9db0-446d-9ed9-9ac2016fffc2" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006290 4812 cpu_manager.go:410] "RemoveStaleState: removing container" 
podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006296 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006328 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006335 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-server" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.006342 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006348 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006540 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006571 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006578 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="949a611c-00dc-4dac-9068-0dc00cf79572" containerName="keystone-api" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006588 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="0921e685-9db0-446d-9ed9-9ac2016fffc2" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006596 4812 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="a84ac0d3-4f66-44f0-8566-37dd9a31bb66" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006602 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="ea90fec5-c635-4a58-9cc4-ff55147e2c26" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006611 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="efd185c8-37f1-4661-b631-524671bff15f" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006617 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="0771cbbe-eeee-435e-8740-edab15c2484c" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006646 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="rsync" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006656 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="c2d7a051-0941-42a6-82dc-76cfa73c185d" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006668 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006679 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-auditor" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006686 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006693 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006701 4812 memory_manager.go:354] "RemoveStaleState removing 
state" podUID="65cc13cf-2366-4670-ac64-27e8ffa38afc" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006729 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="928c7fc4-4d7d-43b9-9ae9-6b350dd4be1b" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006738 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="99392b95-2df1-4600-afd1-c6a4f4d47e5c" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006748 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-expirer" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006755 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d72ee50b-10a1-4e23-b0eb-ec5227c4f740" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006763 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006768 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="9ee32a7e-691a-4b75-b7ae-e32b64c41b36" containerName="rabbitmq" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006776 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d74b15ea-c24b-479e-9469-ee16f0cc85f0" containerName="registry-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006782 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006808 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="014dcd69-d412-4c77-8a96-521ffc036f50" containerName="memcached" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006815 4812 memory_manager.go:354] "RemoveStaleState 
removing state" podUID="5f08c58b-776d-4693-a282-e192ecc83bc2" containerName="operator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006822 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bd8a58fc-0e84-4670-998e-a615bb248ff4" containerName="manager" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006828 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="container-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006857 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-reaper" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006866 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-server" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006872 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="swift-recon-cron" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006881 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="546d2f27-dfde-4446-978b-19b2e6a1d6a0" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006887 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="account-replicator" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006894 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="7666fda0-373e-4936-bd6f-ea26691ad9d5" containerName="galera" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.006901 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" containerName="object-updater" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.007041 4812 cpu_manager.go:410] 
"RemoveStaleState: removing container" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.007051 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.007207 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="4f0da6d0-0d0b-40b3-a3f7-90470b94115b" containerName="mariadb-account-delete" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.007992 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.010734 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.014059 4812 scope.go:117] "RemoveContainer" containerID="4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.056667 4812 scope.go:117] "RemoveContainer" containerID="8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.072954 4812 scope.go:117] "RemoveContainer" containerID="db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.086513 4812 scope.go:117] "RemoveContainer" containerID="fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.095720 4812 reconciler_common.go:293] "Volume detached for volume \"etc-swift\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-etc-swift\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.095748 4812 reconciler_common.go:293] "Volume detached for volume 
\"lock\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-lock\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.095757 4812 reconciler_common.go:293] "Volume detached for volume \"cache\" (UniqueName: \"kubernetes.io/empty-dir/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-cache\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.095786 4812 reconciler_common.go:286] "operationExecutor.UnmountDevice started for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" " Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.095796 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-wbxck\" (UniqueName: \"kubernetes.io/projected/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e-kube-api-access-wbxck\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.104120 4812 scope.go:117] "RemoveContainer" containerID="62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.105462 4812 operation_generator.go:917] UnmountDevice succeeded for volume "local-storage15-crc" (UniqueName: "kubernetes.io/local-volume/local-storage15-crc") on node "crc" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.119642 4812 scope.go:117] "RemoveContainer" containerID="3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.138004 4812 scope.go:117] "RemoveContainer" containerID="552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.162180 4812 scope.go:117] "RemoveContainer" containerID="46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.181260 4812 scope.go:117] "RemoveContainer" 
containerID="7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.196510 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.196632 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gwd9t\" (UniqueName: \"kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.196661 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.196703 4812 reconciler_common.go:293] "Volume detached for volume \"local-storage15-crc\" (UniqueName: \"kubernetes.io/local-volume/local-storage15-crc\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.200594 4812 scope.go:117] "RemoveContainer" containerID="d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.218935 4812 scope.go:117] "RemoveContainer" containerID="cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.232758 4812 
scope.go:117] "RemoveContainer" containerID="6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.258554 4812 scope.go:117] "RemoveContainer" containerID="54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.260541 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c\": container with ID starting with 54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c not found: ID does not exist" containerID="54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.260581 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c"} err="failed to get container status \"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c\": rpc error: code = NotFound desc = could not find container \"54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c\": container with ID starting with 54635704bee638205d33a76a0bc422a7e680e315dafc4fe66148afb221b12c9c not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.260608 4812 scope.go:117] "RemoveContainer" containerID="9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.261159 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc\": container with ID starting with 9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc not found: ID does not exist" containerID="9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc" Jan 31 
04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.261187 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc"} err="failed to get container status \"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc\": rpc error: code = NotFound desc = could not find container \"9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc\": container with ID starting with 9932a0ab3e41a65a1fe61d27f2292ebbe5893c76e70d914752eb60672a26cabc not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.261203 4812 scope.go:117] "RemoveContainer" containerID="055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.263038 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53\": container with ID starting with 055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53 not found: ID does not exist" containerID="055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263082 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53"} err="failed to get container status \"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53\": rpc error: code = NotFound desc = could not find container \"055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53\": container with ID starting with 055480b3bcea455472ce1292098bb6d9db1632cf841ebc889f225a4cbb050e53 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263111 4812 scope.go:117] "RemoveContainer" 
containerID="4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.263472 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3\": container with ID starting with 4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3 not found: ID does not exist" containerID="4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263505 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3"} err="failed to get container status \"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3\": rpc error: code = NotFound desc = could not find container \"4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3\": container with ID starting with 4149b2f95acef8dc11c87809a49165512c071fe08e4da8439042fdacd3de29c3 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263524 4812 scope.go:117] "RemoveContainer" containerID="8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.263879 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1\": container with ID starting with 8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1 not found: ID does not exist" containerID="8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263929 4812 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1"} err="failed to get container status \"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1\": rpc error: code = NotFound desc = could not find container \"8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1\": container with ID starting with 8841b4c30fb782a3a707962f95310727488f27d185b0834f3e9c71a91e7ed3a1 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.263978 4812 scope.go:117] "RemoveContainer" containerID="db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.264274 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab\": container with ID starting with db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab not found: ID does not exist" containerID="db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264302 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab"} err="failed to get container status \"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab\": rpc error: code = NotFound desc = could not find container \"db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab\": container with ID starting with db36cb6cfe0db4ed45344cfd103c5f084ae521c2faa291c1f59412abfd577cab not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264319 4812 scope.go:117] "RemoveContainer" containerID="fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.264546 4812 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc\": container with ID starting with fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc not found: ID does not exist" containerID="fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264579 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc"} err="failed to get container status \"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc\": rpc error: code = NotFound desc = could not find container \"fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc\": container with ID starting with fc692ddfe3509a552a0cbe34c39d944ccd35b7aca14e73cbf3ce07967e7386fc not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264600 4812 scope.go:117] "RemoveContainer" containerID="62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.264854 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3\": container with ID starting with 62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3 not found: ID does not exist" containerID="62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264878 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3"} err="failed to get container status \"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3\": rpc error: code = NotFound desc = could not find container 
\"62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3\": container with ID starting with 62b2166fff148e49f879e4d645a47e814df5c8fdfed7e1edfd0b7411a597c0b3 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.264898 4812 scope.go:117] "RemoveContainer" containerID="3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.265156 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d\": container with ID starting with 3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d not found: ID does not exist" containerID="3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265179 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d"} err="failed to get container status \"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d\": rpc error: code = NotFound desc = could not find container \"3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d\": container with ID starting with 3ff1f36c9a41c146af2c74653d6732df585cdf663f9b407db339109bf183129d not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265195 4812 scope.go:117] "RemoveContainer" containerID="552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.265416 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8\": container with ID starting with 552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8 not found: ID does not exist" 
containerID="552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265441 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8"} err="failed to get container status \"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8\": rpc error: code = NotFound desc = could not find container \"552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8\": container with ID starting with 552485fa6a5a8b076f534282bfdedd4f5ea5544972619e67ced61c0d95a93be8 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265465 4812 scope.go:117] "RemoveContainer" containerID="46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.265680 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94\": container with ID starting with 46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94 not found: ID does not exist" containerID="46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265704 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94"} err="failed to get container status \"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94\": rpc error: code = NotFound desc = could not find container \"46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94\": container with ID starting with 46dc1bf68955f3f6e18971328d2b50d9c431b3b203eb7545ea480f042c6a7e94 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265716 4812 scope.go:117] 
"RemoveContainer" containerID="7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.265946 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a\": container with ID starting with 7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a not found: ID does not exist" containerID="7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265976 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a"} err="failed to get container status \"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a\": rpc error: code = NotFound desc = could not find container \"7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a\": container with ID starting with 7117d534a38576f697ec71f64d2f31d5932b11ffd7dd7ff8ce1cfe5a83dad67a not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.265993 4812 scope.go:117] "RemoveContainer" containerID="d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.266186 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85\": container with ID starting with d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85 not found: ID does not exist" containerID="d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.266201 4812 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85"} err="failed to get container status \"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85\": rpc error: code = NotFound desc = could not find container \"d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85\": container with ID starting with d818bafc1c7d139138a5a5214d2a6ac11e838a10eb304585135c07b29afd5b85 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.266216 4812 scope.go:117] "RemoveContainer" containerID="cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.266393 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0\": container with ID starting with cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0 not found: ID does not exist" containerID="cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.266413 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0"} err="failed to get container status \"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0\": rpc error: code = NotFound desc = could not find container \"cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0\": container with ID starting with cc9cf0f941b1608928d02f37e79e7c36213ee816227ef26955dd2a348ab040f0 not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.266425 4812 scope.go:117] "RemoveContainer" containerID="6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd" Jan 31 04:52:38 crc kubenswrapper[4812]: E0131 04:52:38.266596 4812 log.go:32] "ContainerStatus from runtime service 
failed" err="rpc error: code = NotFound desc = could not find container \"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd\": container with ID starting with 6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd not found: ID does not exist" containerID="6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.266611 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd"} err="failed to get container status \"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd\": rpc error: code = NotFound desc = could not find container \"6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd\": container with ID starting with 6eec466990ac02ede68564a1422253ae6713b61dbdd93679d04292e7b9a00cdd not found: ID does not exist" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.269183 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.274516 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["glance-kuttl-tests/swift-storage-0"] Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.297919 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-gwd9t\" (UniqueName: \"kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.297970 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") 
" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.298025 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.298475 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.298547 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.319943 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-gwd9t\" (UniqueName: \"kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t\") pod \"community-operators-62b5p\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.347818 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="bab986d4-81ba-4f72-a3fc-3b0cdb004c6e" path="/var/lib/kubelet/pods/bab986d4-81ba-4f72-a3fc-3b0cdb004c6e/volumes" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.356414 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.797186 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:38 crc kubenswrapper[4812]: I0131 04:52:38.942420 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerStarted","Data":"b5a4d6df45865ed862b896a90819436260968a526627b103446b066eba60292f"} Jan 31 04:52:39 crc kubenswrapper[4812]: I0131 04:52:39.954031 4812 generic.go:334] "Generic (PLEG): container finished" podID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerID="72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91" exitCode=0 Jan 31 04:52:39 crc kubenswrapper[4812]: I0131 04:52:39.954104 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerDied","Data":"72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91"} Jan 31 04:52:40 crc kubenswrapper[4812]: I0131 04:52:40.989826 4812 generic.go:334] "Generic (PLEG): container finished" podID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerID="c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc" exitCode=0 Jan 31 04:52:40 crc kubenswrapper[4812]: I0131 04:52:40.989947 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerDied","Data":"c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc"} Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.450209 4812 scope.go:117] "RemoveContainer" containerID="7b4f980d4fd4007dcb5809a9be93a83efcdbe26beb11208b5380add0faa92b6b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.479492 4812 scope.go:117] 
"RemoveContainer" containerID="dec5444b1215453156339665e2f31518887e45ccf575b66dd88cd1f8e8c95d5d" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.530817 4812 scope.go:117] "RemoveContainer" containerID="27314d412b708c5ca257551340b90c82ea8fe6ec1679257cc9035ffd320bec1b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.562688 4812 scope.go:117] "RemoveContainer" containerID="92e99bf11cbc3d5dca231d0ee341676584a6e73f60e8fb652b9df931a5511eef" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.604315 4812 scope.go:117] "RemoveContainer" containerID="a9a5b731319dd8729038bc05b80dcaa9557a5c7d753d1e0ed8524290308db031" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.627248 4812 scope.go:117] "RemoveContainer" containerID="e52c41cd53f076fe1c1fd277272600b69fd0fc22d5ca3dadd963169d93ded8b0" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.645195 4812 scope.go:117] "RemoveContainer" containerID="94ec430977786d84f20f896cd6ed99d48f9ab9a151000b0cf26539f8fcd1e9cf" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.677028 4812 scope.go:117] "RemoveContainer" containerID="c45a09b678314eb929c988fb9029a74d43c1da36cd2dc8ae1460e41d6e16d354" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.693177 4812 scope.go:117] "RemoveContainer" containerID="15ff3d3a4018d218ea8ff943bce08ba751225296dc5ab51db1be49868daf375d" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.731464 4812 scope.go:117] "RemoveContainer" containerID="67c8fa1c0270baf5a6ebce83741ac7c6978656eb2b375db434b740a6e59fde8b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.750665 4812 scope.go:117] "RemoveContainer" containerID="f135e6cd7937e468ad1585a4cd7483d96a6e60dfb2f3ca521cd1d8e3d4a07d00" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.766368 4812 scope.go:117] "RemoveContainer" containerID="b698949a0a69c7dc614f1c6150a7e3959a0c7af53a41006aaa9bd24009590a46" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.788608 4812 scope.go:117] "RemoveContainer" 
containerID="f8a54cd567bfe9b9b5f8f1c625fb0f15433914ed7a8f1f19173dfa4b78eb08a2" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.803757 4812 scope.go:117] "RemoveContainer" containerID="c399af486fae32254458fc1ec54740d3ecf0dfdaef20ac6e6ad6148d45b2bcce" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.817378 4812 scope.go:117] "RemoveContainer" containerID="badd670fdeba9d243fe153b69dda39206396e3b07a8eb8d911b7ea393b4fd59b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.834642 4812 scope.go:117] "RemoveContainer" containerID="2719d15c75272d57c130281742452eba890557a3cbbd31ff392413c151a763f6" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.851929 4812 scope.go:117] "RemoveContainer" containerID="38dc490856d8d7a2a51dc77ce5da33b29275b1d6d807677c6f1424d59ff987aa" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.874019 4812 scope.go:117] "RemoveContainer" containerID="274ccd42723a6b9e53535318ebc8580110e93c4297bd453f71f72ba737c3345b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.894137 4812 scope.go:117] "RemoveContainer" containerID="948baaf4e48b4f6d1f1c630de909d1920e5f31224d8aec5922c40658b33b3f19" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.911387 4812 scope.go:117] "RemoveContainer" containerID="f4f45b62a2ea5a5922faf1d10c9506cee5231a9f26bc50e674d0f0a7f8b7dec5" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.930704 4812 scope.go:117] "RemoveContainer" containerID="3b03ecdb0a91647103c8746672aec791c70952ee73965531c1ff88b92f63261b" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.952408 4812 scope.go:117] "RemoveContainer" containerID="38e52c9c0288c0b63675bbc84f846a5149a62c293742776eaf3db9f19a358d2e" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.975210 4812 scope.go:117] "RemoveContainer" containerID="3f710dfb38b60bf2b2117b69bfc90b0c7546143908158cf1565f087060a0d5f4" Jan 31 04:52:41 crc kubenswrapper[4812]: I0131 04:52:41.990949 4812 scope.go:117] "RemoveContainer" 
containerID="b2ba90f9040195ee909aad60a2d2c43363e3a2c7878447cc61bd827ab1c984f0" Jan 31 04:52:42 crc kubenswrapper[4812]: I0131 04:52:42.006848 4812 scope.go:117] "RemoveContainer" containerID="206d46e9bde7ae9d382efa060b105b383b2e08bc746f62d96eb4203d8118d58f" Jan 31 04:52:42 crc kubenswrapper[4812]: I0131 04:52:42.009996 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerStarted","Data":"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d"} Jan 31 04:52:42 crc kubenswrapper[4812]: I0131 04:52:42.038804 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-62b5p" podStartSLOduration=3.6172945480000003 podStartE2EDuration="5.038786683s" podCreationTimestamp="2026-01-31 04:52:37 +0000 UTC" firstStartedPulling="2026-01-31 04:52:39.955573925 +0000 UTC m=+1568.450595610" lastFinishedPulling="2026-01-31 04:52:41.37706608 +0000 UTC m=+1569.872087745" observedRunningTime="2026-01-31 04:52:42.0334332 +0000 UTC m=+1570.528454895" watchObservedRunningTime="2026-01-31 04:52:42.038786683 +0000 UTC m=+1570.533808358" Jan 31 04:52:42 crc kubenswrapper[4812]: I0131 04:52:42.039075 4812 scope.go:117] "RemoveContainer" containerID="26e8194d212ca6552a12c0a8ebb634d9aea1baa5d3b43df5ea1bddef68cbde4a" Jan 31 04:52:42 crc kubenswrapper[4812]: I0131 04:52:42.057401 4812 scope.go:117] "RemoveContainer" containerID="3713b5538e0d15abbe0fdf84d12f55791eb2baa76307d6c76c24e27082f7b7ea" Jan 31 04:52:44 crc kubenswrapper[4812]: I0131 04:52:44.338488 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 04:52:44 crc kubenswrapper[4812]: I0131 04:52:44.338558 
4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 04:52:44 crc kubenswrapper[4812]: I0131 04:52:44.349327 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 04:52:44 crc kubenswrapper[4812]: I0131 04:52:44.350354 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 04:52:44 crc kubenswrapper[4812]: I0131 04:52:44.350585 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" gracePeriod=600 Jan 31 04:52:44 crc kubenswrapper[4812]: E0131 04:52:44.471515 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:52:45 crc kubenswrapper[4812]: I0131 04:52:45.064024 4812 generic.go:334] "Generic (PLEG): container finished" 
podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" exitCode=0 Jan 31 04:52:45 crc kubenswrapper[4812]: I0131 04:52:45.064067 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3"} Jan 31 04:52:45 crc kubenswrapper[4812]: I0131 04:52:45.064139 4812 scope.go:117] "RemoveContainer" containerID="b6547cb6e9072bd2109b078699f64952c12bea692f52442e858dc4ce59b6b718" Jan 31 04:52:45 crc kubenswrapper[4812]: I0131 04:52:45.064965 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:52:45 crc kubenswrapper[4812]: E0131 04:52:45.065399 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.191874 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5mhxf/must-gather-5sdsp"] Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.194110 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.209097 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mhxf/must-gather-5sdsp"] Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.209131 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mhxf"/"kube-root-ca.crt" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.209242 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-5mhxf"/"openshift-service-ca.crt" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.321390 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bv78l\" (UniqueName: \"kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.321484 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.424272 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-bv78l\" (UniqueName: \"kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.424310 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.424763 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.443097 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-bv78l\" (UniqueName: \"kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l\") pod \"must-gather-5sdsp\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.513339 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:52:47 crc kubenswrapper[4812]: I0131 04:52:47.914155 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5mhxf/must-gather-5sdsp"] Jan 31 04:52:48 crc kubenswrapper[4812]: I0131 04:52:48.101577 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" event={"ID":"ed73d7df-99ea-46db-8806-29efceb440f8","Type":"ContainerStarted","Data":"b3b15febb5534e2b98d6aba7fdf0db071ea9651578afa9894346d1f880a4ae0f"} Jan 31 04:52:48 crc kubenswrapper[4812]: I0131 04:52:48.357093 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:48 crc kubenswrapper[4812]: I0131 04:52:48.357148 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:48 crc kubenswrapper[4812]: I0131 04:52:48.409401 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:49 crc kubenswrapper[4812]: I0131 04:52:49.149570 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:50 crc kubenswrapper[4812]: I0131 04:52:50.789566 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:51 crc kubenswrapper[4812]: I0131 04:52:51.128660 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-62b5p" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="registry-server" containerID="cri-o://c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d" gracePeriod=2 Jan 31 04:52:51 crc kubenswrapper[4812]: I0131 04:52:51.989977 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.080868 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities\") pod \"907bac78-f420-4fb6-93f4-20db94eee8cd\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.080989 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content\") pod \"907bac78-f420-4fb6-93f4-20db94eee8cd\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.081063 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gwd9t\" (UniqueName: \"kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t\") pod \"907bac78-f420-4fb6-93f4-20db94eee8cd\" (UID: \"907bac78-f420-4fb6-93f4-20db94eee8cd\") " Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.082105 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities" (OuterVolumeSpecName: "utilities") pod "907bac78-f420-4fb6-93f4-20db94eee8cd" (UID: "907bac78-f420-4fb6-93f4-20db94eee8cd"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.096479 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t" (OuterVolumeSpecName: "kube-api-access-gwd9t") pod "907bac78-f420-4fb6-93f4-20db94eee8cd" (UID: "907bac78-f420-4fb6-93f4-20db94eee8cd"). InnerVolumeSpecName "kube-api-access-gwd9t". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.133531 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "907bac78-f420-4fb6-93f4-20db94eee8cd" (UID: "907bac78-f420-4fb6-93f4-20db94eee8cd"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.136394 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" event={"ID":"ed73d7df-99ea-46db-8806-29efceb440f8","Type":"ContainerStarted","Data":"57326236ed7e70c10bc8e3140f562eb36486de90dcc0d796b4796fe81e2a3caa"} Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.139928 4812 generic.go:334] "Generic (PLEG): container finished" podID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerID="c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d" exitCode=0 Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.139967 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerDied","Data":"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d"} Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.139993 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-62b5p" event={"ID":"907bac78-f420-4fb6-93f4-20db94eee8cd","Type":"ContainerDied","Data":"b5a4d6df45865ed862b896a90819436260968a526627b103446b066eba60292f"} Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.140008 4812 scope.go:117] "RemoveContainer" containerID="c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.140120 4812 util.go:48] "No ready sandbox 
for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-62b5p" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.182387 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.182595 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.182620 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-gwd9t\" (UniqueName: \"kubernetes.io/projected/907bac78-f420-4fb6-93f4-20db94eee8cd-kube-api-access-gwd9t\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.182646 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/907bac78-f420-4fb6-93f4-20db94eee8cd-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.185743 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-62b5p"] Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.187635 4812 scope.go:117] "RemoveContainer" containerID="c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.208571 4812 scope.go:117] "RemoveContainer" containerID="72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.228097 4812 scope.go:117] "RemoveContainer" containerID="c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d" Jan 31 04:52:52 crc kubenswrapper[4812]: E0131 04:52:52.228729 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d\": container with ID starting with c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d not found: ID does not exist" containerID="c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.228772 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d"} err="failed to get container status \"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d\": rpc error: code = NotFound desc = could not find container \"c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d\": container with ID starting with c54975b5fc092c5080aa39960b7dc9d24b0e6b90e9c5d467334e1c4a9d00c32d not found: ID does not exist" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.228803 4812 scope.go:117] "RemoveContainer" containerID="c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc" Jan 31 04:52:52 crc kubenswrapper[4812]: E0131 04:52:52.229155 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc\": container with ID starting with c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc not found: ID does not exist" containerID="c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.229180 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc"} err="failed to get container status \"c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc\": rpc error: code = NotFound desc = could not find container \"c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc\": container with ID 
starting with c28cbde019f748ecd71c0b9c0990f4fcae41fdc98982816e3834bd67a602e1cc not found: ID does not exist" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.229197 4812 scope.go:117] "RemoveContainer" containerID="72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91" Jan 31 04:52:52 crc kubenswrapper[4812]: E0131 04:52:52.229445 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91\": container with ID starting with 72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91 not found: ID does not exist" containerID="72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.229485 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91"} err="failed to get container status \"72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91\": rpc error: code = NotFound desc = could not find container \"72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91\": container with ID starting with 72b155e23a3271086e1b2ab30085f5a2e5844cef6fd034aba7e6bc97824e1f91 not found: ID does not exist" Jan 31 04:52:52 crc kubenswrapper[4812]: I0131 04:52:52.349328 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" path="/var/lib/kubelet/pods/907bac78-f420-4fb6-93f4-20db94eee8cd/volumes" Jan 31 04:52:53 crc kubenswrapper[4812]: I0131 04:52:53.150362 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" event={"ID":"ed73d7df-99ea-46db-8806-29efceb440f8","Type":"ContainerStarted","Data":"50073db6230e3e606599f15666914043f9fb71b9af0f9767eef14da6eb0b57f0"} Jan 31 04:52:56 crc kubenswrapper[4812]: I0131 04:52:56.339007 4812 scope.go:117] 
"RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:52:56 crc kubenswrapper[4812]: E0131 04:52:56.339521 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.405703 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.408055 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:01.908026207 +0000 UTC m=+1590.403047912 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.406943 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.408406 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. 
No retries permitted until 2026-01-31 04:53:01.908389647 +0000 UTC m=+1590.403411342 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.912857 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.912885 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.912925 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:02.912909329 +0000 UTC m=+1591.407930994 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:01 crc kubenswrapper[4812]: E0131 04:53:01.913019 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:02.912982371 +0000 UTC m=+1591.408004076 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:02 crc kubenswrapper[4812]: E0131 04:53:02.926209 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:02 crc kubenswrapper[4812]: E0131 04:53:02.926256 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:02 crc kubenswrapper[4812]: E0131 04:53:02.926306 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:04.926287919 +0000 UTC m=+1593.421309584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:02 crc kubenswrapper[4812]: E0131 04:53:02.926326 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:04.92631788 +0000 UTC m=+1593.421339685 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:04 crc kubenswrapper[4812]: E0131 04:53:04.952218 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:04 crc kubenswrapper[4812]: E0131 04:53:04.952313 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:04 crc kubenswrapper[4812]: E0131 04:53:04.952591 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:08.952576656 +0000 UTC m=+1597.447598321 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:04 crc kubenswrapper[4812]: E0131 04:53:04.952737 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:08.952707819 +0000 UTC m=+1597.447729524 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:07 crc kubenswrapper[4812]: I0131 04:53:07.340438 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:53:07 crc kubenswrapper[4812]: E0131 04:53:07.341105 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:53:09 crc kubenswrapper[4812]: E0131 04:53:09.008702 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:09 crc kubenswrapper[4812]: E0131 04:53:09.008775 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:17.008761907 +0000 UTC m=+1605.503783572 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:09 crc kubenswrapper[4812]: E0131 04:53:09.008791 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:09 crc kubenswrapper[4812]: E0131 04:53:09.008916 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:17.008889961 +0000 UTC m=+1605.503911646 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:17 crc kubenswrapper[4812]: E0131 04:53:17.021924 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:17 crc kubenswrapper[4812]: E0131 04:53:17.022469 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:33.022450212 +0000 UTC m=+1621.517471877 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:17 crc kubenswrapper[4812]: E0131 04:53:17.021949 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:17 crc kubenswrapper[4812]: E0131 04:53:17.022595 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:53:33.022571775 +0000 UTC m=+1621.517593510 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:19 crc kubenswrapper[4812]: I0131 04:53:19.340188 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:53:19 crc kubenswrapper[4812]: E0131 04:53:19.340681 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.304763 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.505800 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.524844 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.552734 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.654327 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.673903 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/extract/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.675553 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:53:30 crc kubenswrapper[4812]: I0131 04:53:30.832782 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d479c78f-kf6sb_500efc8e-b639-4788-833f-3cb9189e1009/manager/0.log" Jan 31 04:53:30 crc 
kubenswrapper[4812]: I0131 04:53:30.855585 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-dggq6_bdfad5bb-7984-4248-9598-0319bb4543e0/registry-server/0.log" Jan 31 04:53:33 crc kubenswrapper[4812]: E0131 04:53:33.056056 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:53:33 crc kubenswrapper[4812]: E0131 04:53:33.056090 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:53:33 crc kubenswrapper[4812]: E0131 04:53:33.056453 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:54:05.056430299 +0000 UTC m=+1653.551451984 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:53:33 crc kubenswrapper[4812]: E0131 04:53:33.056474 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:54:05.0564634 +0000 UTC m=+1653.551485065 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:53:33 crc kubenswrapper[4812]: I0131 04:53:33.339738 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:53:33 crc kubenswrapper[4812]: E0131 04:53:33.340176 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.438374 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" podStartSLOduration=46.589573339 podStartE2EDuration="50.438349458s" podCreationTimestamp="2026-01-31 04:52:47 +0000 UTC" firstStartedPulling="2026-01-31 04:52:47.925161451 +0000 UTC m=+1576.420183116" lastFinishedPulling="2026-01-31 04:52:51.77393755 +0000 UTC m=+1580.268959235" observedRunningTime="2026-01-31 04:52:53.169153924 +0000 UTC m=+1581.664175589" watchObservedRunningTime="2026-01-31 04:53:37.438349458 +0000 UTC m=+1625.933371163" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.439967 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:37 crc kubenswrapper[4812]: E0131 04:53:37.440275 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="extract-utilities" Jan 31 04:53:37 crc 
kubenswrapper[4812]: I0131 04:53:37.440303 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="extract-utilities" Jan 31 04:53:37 crc kubenswrapper[4812]: E0131 04:53:37.440345 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="extract-content" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.440360 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="extract-content" Jan 31 04:53:37 crc kubenswrapper[4812]: E0131 04:53:37.440387 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="registry-server" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.440401 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="registry-server" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.440611 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="907bac78-f420-4fb6-93f4-20db94eee8cd" containerName="registry-server" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.442152 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.454697 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.624222 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.624582 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pt2s2\" (UniqueName: \"kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.624626 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.726263 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.726315 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-pt2s2\" (UniqueName: \"kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.726347 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.726926 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.726987 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.752123 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-pt2s2\" (UniqueName: \"kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2\") pod \"redhat-marketplace-kcxk4\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:37 crc kubenswrapper[4812]: I0131 04:53:37.792708 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:38 crc kubenswrapper[4812]: I0131 04:53:38.234594 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:38 crc kubenswrapper[4812]: I0131 04:53:38.469180 4812 generic.go:334] "Generic (PLEG): container finished" podID="23007872-a429-42af-a5ce-72eff51d7d83" containerID="ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e" exitCode=0 Jan 31 04:53:38 crc kubenswrapper[4812]: I0131 04:53:38.469277 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerDied","Data":"ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e"} Jan 31 04:53:38 crc kubenswrapper[4812]: I0131 04:53:38.469617 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerStarted","Data":"4dceaf896c13c32c826d4d12d98cd8541a61a10007396b8dc3fa8a5698d901f2"} Jan 31 04:53:39 crc kubenswrapper[4812]: I0131 04:53:39.479041 4812 generic.go:334] "Generic (PLEG): container finished" podID="23007872-a429-42af-a5ce-72eff51d7d83" containerID="19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386" exitCode=0 Jan 31 04:53:39 crc kubenswrapper[4812]: I0131 04:53:39.479125 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerDied","Data":"19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386"} Jan 31 04:53:40 crc kubenswrapper[4812]: I0131 04:53:40.488777 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" 
event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerStarted","Data":"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc"} Jan 31 04:53:40 crc kubenswrapper[4812]: I0131 04:53:40.510570 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-marketplace-kcxk4" podStartSLOduration=2.105339056 podStartE2EDuration="3.510545436s" podCreationTimestamp="2026-01-31 04:53:37 +0000 UTC" firstStartedPulling="2026-01-31 04:53:38.471492276 +0000 UTC m=+1626.966513971" lastFinishedPulling="2026-01-31 04:53:39.876698686 +0000 UTC m=+1628.371720351" observedRunningTime="2026-01-31 04:53:40.505275955 +0000 UTC m=+1629.000297660" watchObservedRunningTime="2026-01-31 04:53:40.510545436 +0000 UTC m=+1629.005567121" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.474729 4812 scope.go:117] "RemoveContainer" containerID="0e5337d2ce3736a25cf57e4b0ff878deae238372e13026ef171c21b721a2b7da" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.508167 4812 scope.go:117] "RemoveContainer" containerID="1912eb3f1bd65883a6d6c9741bcf7a2a5180f155c29d261cb16b93591689948e" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.546032 4812 scope.go:117] "RemoveContainer" containerID="e51ab9b26999f0955f93725960efb8199b1fee513f942f95303d4655c737afac" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.572628 4812 scope.go:117] "RemoveContainer" containerID="c4a6184a03c27e1e3532aad1acb22e775b25f84064b83caa079ffd87496ab4a5" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.596446 4812 scope.go:117] "RemoveContainer" containerID="6d8832b0bc6414e5e64d767b2c3eff38c263d5767886a75c27a33f8f6913b33d" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.621857 4812 scope.go:117] "RemoveContainer" containerID="be533b808bb3e368e8424c4813d13d7c98276f784fdfd2562862634ebbb30094" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.672014 4812 scope.go:117] "RemoveContainer" 
containerID="877083d9b518409a463edb666d633e4c7ccd9f471cd9f50026d9061a729273c0" Jan 31 04:53:42 crc kubenswrapper[4812]: I0131 04:53:42.685864 4812 scope.go:117] "RemoveContainer" containerID="64ef56966501bb3212494535586a8d5491def695b937c425a1a878459582fdc7" Jan 31 04:53:44 crc kubenswrapper[4812]: I0131 04:53:44.358867 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qlf2j_e460d967-199b-41b2-a198-3acaaa1f4382/control-plane-machine-set-operator/0.log" Jan 31 04:53:44 crc kubenswrapper[4812]: I0131 04:53:44.517000 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rm7wz_e5c2893c-e678-4c5e-8692-8d50c2510ded/kube-rbac-proxy/0.log" Jan 31 04:53:44 crc kubenswrapper[4812]: I0131 04:53:44.577697 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rm7wz_e5c2893c-e678-4c5e-8692-8d50c2510ded/machine-api-operator/0.log" Jan 31 04:53:45 crc kubenswrapper[4812]: I0131 04:53:45.339676 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:53:45 crc kubenswrapper[4812]: E0131 04:53:45.340106 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:53:47 crc kubenswrapper[4812]: I0131 04:53:47.793494 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:47 crc kubenswrapper[4812]: I0131 04:53:47.793861 4812 kubelet.go:2542] "SyncLoop 
(probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:47 crc kubenswrapper[4812]: I0131 04:53:47.858702 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:48 crc kubenswrapper[4812]: I0131 04:53:48.587878 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:48 crc kubenswrapper[4812]: I0131 04:53:48.629639 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:50 crc kubenswrapper[4812]: I0131 04:53:50.546616 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-marketplace-kcxk4" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="registry-server" containerID="cri-o://5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc" gracePeriod=2 Jan 31 04:53:50 crc kubenswrapper[4812]: I0131 04:53:50.964745 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.119660 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content\") pod \"23007872-a429-42af-a5ce-72eff51d7d83\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.119719 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pt2s2\" (UniqueName: \"kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2\") pod \"23007872-a429-42af-a5ce-72eff51d7d83\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.119765 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities\") pod \"23007872-a429-42af-a5ce-72eff51d7d83\" (UID: \"23007872-a429-42af-a5ce-72eff51d7d83\") " Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.120854 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities" (OuterVolumeSpecName: "utilities") pod "23007872-a429-42af-a5ce-72eff51d7d83" (UID: "23007872-a429-42af-a5ce-72eff51d7d83"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.133002 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2" (OuterVolumeSpecName: "kube-api-access-pt2s2") pod "23007872-a429-42af-a5ce-72eff51d7d83" (UID: "23007872-a429-42af-a5ce-72eff51d7d83"). InnerVolumeSpecName "kube-api-access-pt2s2". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.176225 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "23007872-a429-42af-a5ce-72eff51d7d83" (UID: "23007872-a429-42af-a5ce-72eff51d7d83"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.221379 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.221414 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/23007872-a429-42af-a5ce-72eff51d7d83-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.221427 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-pt2s2\" (UniqueName: \"kubernetes.io/projected/23007872-a429-42af-a5ce-72eff51d7d83-kube-api-access-pt2s2\") on node \"crc\" DevicePath \"\"" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.555164 4812 generic.go:334] "Generic (PLEG): container finished" podID="23007872-a429-42af-a5ce-72eff51d7d83" containerID="5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc" exitCode=0 Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.555232 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-marketplace-kcxk4" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.555267 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerDied","Data":"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc"} Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.556418 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-marketplace-kcxk4" event={"ID":"23007872-a429-42af-a5ce-72eff51d7d83","Type":"ContainerDied","Data":"4dceaf896c13c32c826d4d12d98cd8541a61a10007396b8dc3fa8a5698d901f2"} Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.556440 4812 scope.go:117] "RemoveContainer" containerID="5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.580802 4812 scope.go:117] "RemoveContainer" containerID="19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.589348 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.598486 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-marketplace-kcxk4"] Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.622526 4812 scope.go:117] "RemoveContainer" containerID="ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.652071 4812 scope.go:117] "RemoveContainer" containerID="5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc" Jan 31 04:53:51 crc kubenswrapper[4812]: E0131 04:53:51.652485 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc\": container with ID starting with 5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc not found: ID does not exist" containerID="5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.652510 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc"} err="failed to get container status \"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc\": rpc error: code = NotFound desc = could not find container \"5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc\": container with ID starting with 5b643d9904ab7b4d95300d4bcfbb32e878c0b333f8c7fdd6eb37da857494b2fc not found: ID does not exist" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.652530 4812 scope.go:117] "RemoveContainer" containerID="19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386" Jan 31 04:53:51 crc kubenswrapper[4812]: E0131 04:53:51.652854 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386\": container with ID starting with 19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386 not found: ID does not exist" containerID="19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.652874 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386"} err="failed to get container status \"19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386\": rpc error: code = NotFound desc = could not find container \"19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386\": container with ID 
starting with 19687995bdae4238a91b52540b4066dbc7218596331666fbd2d80da56cd12386 not found: ID does not exist" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.652885 4812 scope.go:117] "RemoveContainer" containerID="ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e" Jan 31 04:53:51 crc kubenswrapper[4812]: E0131 04:53:51.653164 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e\": container with ID starting with ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e not found: ID does not exist" containerID="ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e" Jan 31 04:53:51 crc kubenswrapper[4812]: I0131 04:53:51.653184 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e"} err="failed to get container status \"ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e\": rpc error: code = NotFound desc = could not find container \"ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e\": container with ID starting with ce18b0d306a2114465568fd5a5ea89ea44229b6372219b63bb93561dde78dd2e not found: ID does not exist" Jan 31 04:53:52 crc kubenswrapper[4812]: I0131 04:53:52.346062 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23007872-a429-42af-a5ce-72eff51d7d83" path="/var/lib/kubelet/pods/23007872-a429-42af-a5ce-72eff51d7d83/volumes" Jan 31 04:53:56 crc kubenswrapper[4812]: I0131 04:53:56.339666 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:53:56 crc kubenswrapper[4812]: E0131 04:53:56.340156 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s 
restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:54:05 crc kubenswrapper[4812]: E0131 04:54:05.114576 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:54:05 crc kubenswrapper[4812]: E0131 04:54:05.115127 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:55:09.115110552 +0000 UTC m=+1717.610132207 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:54:05 crc kubenswrapper[4812]: E0131 04:54:05.114603 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:54:05 crc kubenswrapper[4812]: E0131 04:54:05.115238 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:55:09.115220425 +0000 UTC m=+1717.610242090 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:54:07 crc kubenswrapper[4812]: I0131 04:54:07.340092 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:54:07 crc kubenswrapper[4812]: E0131 04:54:07.340473 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:54:13 crc kubenswrapper[4812]: I0131 04:54:13.912727 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-hldjr_b7380313-059d-4437-a3dc-371ce0a51fc3/kube-rbac-proxy/0.log" Jan 31 04:54:13 crc kubenswrapper[4812]: I0131 04:54:13.957314 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-hldjr_b7380313-059d-4437-a3dc-371ce0a51fc3/controller/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.059600 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.269313 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.292381 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.306624 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.324010 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.473935 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.474732 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.528427 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.584159 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.693591 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.715087 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.723301 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.784486 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/controller/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.921343 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/kube-rbac-proxy/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.935865 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/frr-metrics/0.log" Jan 31 04:54:14 crc kubenswrapper[4812]: I0131 04:54:14.950438 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/kube-rbac-proxy-frr/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.122307 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-kctff_c8c1c067-45c3-4fc8-b656-920d058691ee/frr-k8s-webhook-server/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.135099 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/reloader/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.325424 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-8458b697d8-qwvb2_3e6043c0-61f1-4cb1-a1df-056d81c22ea0/manager/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.502345 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/frr/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.523678 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56b4c6c8df-wjtg6_dfe9ef2d-1841-414a-9645-84d15e3fa9e5/webhook-server/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.556861 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pnrw7_b0653050-2c1a-48f2-9f1b-10ccd0366143/kube-rbac-proxy/0.log" Jan 31 04:54:15 crc kubenswrapper[4812]: I0131 04:54:15.808455 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pnrw7_b0653050-2c1a-48f2-9f1b-10ccd0366143/speaker/0.log" Jan 31 04:54:19 crc kubenswrapper[4812]: I0131 04:54:19.340388 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:54:19 crc kubenswrapper[4812]: E0131 04:54:19.341138 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:54:27 crc kubenswrapper[4812]: I0131 04:54:27.845046 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62/openstackclient/0.log" Jan 31 04:54:30 crc kubenswrapper[4812]: I0131 04:54:30.339293 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:54:30 crc kubenswrapper[4812]: E0131 04:54:30.339927 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon 
pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.075366 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.186218 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.216325 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.262972 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.424522 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.426080 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.445146 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/extract/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.608896 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.742177 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.748157 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.793380 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.971501 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 04:54:41 crc kubenswrapper[4812]: I0131 04:54:41.979889 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.144412 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.342187 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 
04:54:42 crc kubenswrapper[4812]: E0131 04:54:42.342389 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.355745 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/registry-server/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.382418 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.413874 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.414602 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.583976 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.603802 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.813728 4812 scope.go:117] "RemoveContainer" 
containerID="cd2b227df70adcd77a1fd00609f84655ff153e429369e1133b0221e400f9f704" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.847962 4812 scope.go:117] "RemoveContainer" containerID="4b377c58a34afd1cf83cbcda6b639f017fa79cedaa3ca68b6e33353d35e8487a" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.859272 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rcfw5_55175f00-9682-4c72-a26a-3b050c99af46/marketplace-operator/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.882417 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.896005 4812 scope.go:117] "RemoveContainer" containerID="3cec6b79468f1dc473449d009be78ed472173e623493772d3e2578cb84aa1eb6" Jan 31 04:54:42 crc kubenswrapper[4812]: I0131 04:54:42.926770 4812 scope.go:117] "RemoveContainer" containerID="25b9311210eee8271b5ac70f79a7c1f9add1137009ef0612cf6443e0f3b7dd00" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.107122 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/registry-server/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.127561 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.131348 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.135887 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.268872 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.323879 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.351308 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/registry-server/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.427529 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.607699 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.634789 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.673154 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.799672 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 
31 04:54:43 crc kubenswrapper[4812]: I0131 04:54:43.804681 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 04:54:44 crc kubenswrapper[4812]: I0131 04:54:44.077794 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/registry-server/0.log" Jan 31 04:54:53 crc kubenswrapper[4812]: I0131 04:54:53.339522 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:54:53 crc kubenswrapper[4812]: E0131 04:54:53.340536 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:55:05 crc kubenswrapper[4812]: I0131 04:55:05.734101 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:55:05 crc kubenswrapper[4812]: E0131 04:55:05.735068 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:55:09 crc kubenswrapper[4812]: E0131 04:55:09.184261 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 
04:55:09 crc kubenswrapper[4812]: E0131 04:55:09.184291 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:55:09 crc kubenswrapper[4812]: E0131 04:55:09.184705 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:57:11.184683822 +0000 UTC m=+1839.679705487 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:55:09 crc kubenswrapper[4812]: E0131 04:55:09.184726 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:57:11.184714963 +0000 UTC m=+1839.679736628 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:55:16 crc kubenswrapper[4812]: I0131 04:55:16.339697 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:55:16 crc kubenswrapper[4812]: E0131 04:55:16.340888 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:55:31 crc kubenswrapper[4812]: I0131 04:55:31.339495 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:55:31 crc kubenswrapper[4812]: E0131 04:55:31.340829 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:55:43 crc kubenswrapper[4812]: I0131 04:55:43.002192 4812 scope.go:117] "RemoveContainer" containerID="f75841c41655227dc054cbea93b88f07552339c5ec27980edfef4e564e71f7f6" Jan 31 04:55:43 crc kubenswrapper[4812]: I0131 04:55:43.043947 4812 scope.go:117] "RemoveContainer" containerID="af36f8e080623bebcc8e00506e8941ed5df7ae530bc826a550ed41757d9aa46d" Jan 31 04:55:43 
crc kubenswrapper[4812]: I0131 04:55:43.094023 4812 scope.go:117] "RemoveContainer" containerID="366a19809bf79181e5960492acbe1cca1fef0f71242f55b20fe6c97a3855e15e" Jan 31 04:55:43 crc kubenswrapper[4812]: I0131 04:55:43.108448 4812 scope.go:117] "RemoveContainer" containerID="dadee4a8340e3b7724a378c5015d9216f0e59911788a87794ccfb28deee3ec3c" Jan 31 04:55:46 crc kubenswrapper[4812]: I0131 04:55:46.339781 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:55:46 crc kubenswrapper[4812]: E0131 04:55:46.340352 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:55:55 crc kubenswrapper[4812]: I0131 04:55:55.452403 4812 generic.go:334] "Generic (PLEG): container finished" podID="ed73d7df-99ea-46db-8806-29efceb440f8" containerID="57326236ed7e70c10bc8e3140f562eb36486de90dcc0d796b4796fe81e2a3caa" exitCode=0 Jan 31 04:55:55 crc kubenswrapper[4812]: I0131 04:55:55.452659 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" event={"ID":"ed73d7df-99ea-46db-8806-29efceb440f8","Type":"ContainerDied","Data":"57326236ed7e70c10bc8e3140f562eb36486de90dcc0d796b4796fe81e2a3caa"} Jan 31 04:55:55 crc kubenswrapper[4812]: I0131 04:55:55.453524 4812 scope.go:117] "RemoveContainer" containerID="57326236ed7e70c10bc8e3140f562eb36486de90dcc0d796b4796fe81e2a3caa" Jan 31 04:55:55 crc kubenswrapper[4812]: I0131 04:55:55.582067 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mhxf_must-gather-5sdsp_ed73d7df-99ea-46db-8806-29efceb440f8/gather/0.log" Jan 31 
04:56:00 crc kubenswrapper[4812]: I0131 04:56:00.342426 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:56:00 crc kubenswrapper[4812]: E0131 04:56:00.343344 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.334172 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5mhxf/must-gather-5sdsp"] Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.334988 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="copy" containerID="cri-o://50073db6230e3e606599f15666914043f9fb71b9af0f9767eef14da6eb0b57f0" gracePeriod=2 Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.338038 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5mhxf/must-gather-5sdsp"] Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.496497 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mhxf_must-gather-5sdsp_ed73d7df-99ea-46db-8806-29efceb440f8/copy/0.log" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.497167 4812 generic.go:334] "Generic (PLEG): container finished" podID="ed73d7df-99ea-46db-8806-29efceb440f8" containerID="50073db6230e3e606599f15666914043f9fb71b9af0f9767eef14da6eb0b57f0" exitCode=143 Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.650050 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-5mhxf_must-gather-5sdsp_ed73d7df-99ea-46db-8806-29efceb440f8/copy/0.log" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.650644 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.802796 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output\") pod \"ed73d7df-99ea-46db-8806-29efceb440f8\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.802873 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bv78l\" (UniqueName: \"kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l\") pod \"ed73d7df-99ea-46db-8806-29efceb440f8\" (UID: \"ed73d7df-99ea-46db-8806-29efceb440f8\") " Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.807764 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l" (OuterVolumeSpecName: "kube-api-access-bv78l") pod "ed73d7df-99ea-46db-8806-29efceb440f8" (UID: "ed73d7df-99ea-46db-8806-29efceb440f8"). InnerVolumeSpecName "kube-api-access-bv78l". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.867482 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "ed73d7df-99ea-46db-8806-29efceb440f8" (UID: "ed73d7df-99ea-46db-8806-29efceb440f8"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.904855 4812 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/ed73d7df-99ea-46db-8806-29efceb440f8-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 04:56:02 crc kubenswrapper[4812]: I0131 04:56:02.904888 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-bv78l\" (UniqueName: \"kubernetes.io/projected/ed73d7df-99ea-46db-8806-29efceb440f8-kube-api-access-bv78l\") on node \"crc\" DevicePath \"\"" Jan 31 04:56:03 crc kubenswrapper[4812]: I0131 04:56:03.504050 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5mhxf_must-gather-5sdsp_ed73d7df-99ea-46db-8806-29efceb440f8/copy/0.log" Jan 31 04:56:03 crc kubenswrapper[4812]: I0131 04:56:03.505146 4812 scope.go:117] "RemoveContainer" containerID="50073db6230e3e606599f15666914043f9fb71b9af0f9767eef14da6eb0b57f0" Jan 31 04:56:03 crc kubenswrapper[4812]: I0131 04:56:03.505343 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5mhxf/must-gather-5sdsp" Jan 31 04:56:03 crc kubenswrapper[4812]: I0131 04:56:03.524876 4812 scope.go:117] "RemoveContainer" containerID="57326236ed7e70c10bc8e3140f562eb36486de90dcc0d796b4796fe81e2a3caa" Jan 31 04:56:04 crc kubenswrapper[4812]: I0131 04:56:04.349657 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" path="/var/lib/kubelet/pods/ed73d7df-99ea-46db-8806-29efceb440f8/volumes" Jan 31 04:56:14 crc kubenswrapper[4812]: I0131 04:56:14.340240 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:56:14 crc kubenswrapper[4812]: E0131 04:56:14.340903 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:56:25 crc kubenswrapper[4812]: I0131 04:56:25.340394 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:56:25 crc kubenswrapper[4812]: E0131 04:56:25.341477 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:56:39 crc kubenswrapper[4812]: I0131 04:56:39.340807 4812 scope.go:117] "RemoveContainer" 
containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:56:39 crc kubenswrapper[4812]: E0131 04:56:39.341631 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:56:43 crc kubenswrapper[4812]: I0131 04:56:43.182603 4812 scope.go:117] "RemoveContainer" containerID="344fa82c4dd6d056f4281c708cb818888da155a569fa80ba48c3d91457f3dc40" Jan 31 04:56:43 crc kubenswrapper[4812]: I0131 04:56:43.257933 4812 scope.go:117] "RemoveContainer" containerID="ddf4600fa4e9f3d01c4770936f8efa571970639e0ef06ce4d63665ade579e58d" Jan 31 04:56:43 crc kubenswrapper[4812]: I0131 04:56:43.303023 4812 scope.go:117] "RemoveContainer" containerID="8901e46123fe46ba1433e007ba32f598219da20e58162e7df92c8bb695ec658f" Jan 31 04:56:43 crc kubenswrapper[4812]: I0131 04:56:43.325020 4812 scope.go:117] "RemoveContainer" containerID="8019826c3c57b77db5b466c6e52567e140ee905c9b8a713f60230f5f269213f7" Jan 31 04:56:52 crc kubenswrapper[4812]: I0131 04:56:52.345699 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:56:52 crc kubenswrapper[4812]: E0131 04:56:52.346762 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:57:07 crc 
kubenswrapper[4812]: I0131 04:57:07.339705 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:57:07 crc kubenswrapper[4812]: E0131 04:57:07.341127 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:57:11 crc kubenswrapper[4812]: E0131 04:57:11.282896 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:57:11 crc kubenswrapper[4812]: E0131 04:57:11.283434 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 04:59:13.283399268 +0000 UTC m=+1961.778420983 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:57:11 crc kubenswrapper[4812]: E0131 04:57:11.282940 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:57:11 crc kubenswrapper[4812]: E0131 04:57:11.283577 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. 
No retries permitted until 2026-01-31 04:59:13.283544361 +0000 UTC m=+1961.778566086 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:57:21 crc kubenswrapper[4812]: I0131 04:57:21.339778 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:57:21 crc kubenswrapper[4812]: E0131 04:57:21.340509 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:57:34 crc kubenswrapper[4812]: I0131 04:57:34.340424 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:57:34 crc kubenswrapper[4812]: E0131 04:57:34.341268 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" Jan 31 04:57:49 crc kubenswrapper[4812]: I0131 04:57:49.339522 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 04:57:50 crc kubenswrapper[4812]: I0131 04:57:50.328614 4812 
kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee"} Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.320346 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-must-gather-sk9tl/must-gather-tmwtc"] Jan 31 04:58:28 crc kubenswrapper[4812]: E0131 04:58:28.321159 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="copy" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321174 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="copy" Jan 31 04:58:28 crc kubenswrapper[4812]: E0131 04:58:28.321186 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="registry-server" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321192 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="registry-server" Jan 31 04:58:28 crc kubenswrapper[4812]: E0131 04:58:28.321210 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="gather" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321217 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="gather" Jan 31 04:58:28 crc kubenswrapper[4812]: E0131 04:58:28.321229 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="extract-utilities" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321236 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="extract-utilities" Jan 31 
04:58:28 crc kubenswrapper[4812]: E0131 04:58:28.321253 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="extract-content" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321260 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="extract-content" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321378 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="23007872-a429-42af-a5ce-72eff51d7d83" containerName="registry-server" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321393 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="gather" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.321402 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="ed73d7df-99ea-46db-8806-29efceb440f8" containerName="copy" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.323852 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.327056 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sk9tl"/"kube-root-ca.crt" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.333506 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-must-gather-sk9tl"/"openshift-service-ca.crt" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.337521 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sk9tl/must-gather-tmwtc"] Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.490341 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45rbl\" (UniqueName: \"kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.491267 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.592529 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-45rbl\" (UniqueName: \"kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.592932 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume 
\"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.593353 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.625173 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-45rbl\" (UniqueName: \"kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl\") pod \"must-gather-tmwtc\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.650853 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 04:58:28 crc kubenswrapper[4812]: I0131 04:58:28.917934 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-sk9tl/must-gather-tmwtc"] Jan 31 04:58:29 crc kubenswrapper[4812]: I0131 04:58:29.649663 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" event={"ID":"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d","Type":"ContainerStarted","Data":"65d4ae136ea66a0d2cc3e579fbbae81a420a28e00e79073a4b6b2a47d4d8f232"} Jan 31 04:58:29 crc kubenswrapper[4812]: I0131 04:58:29.649919 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" event={"ID":"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d","Type":"ContainerStarted","Data":"3c4a6eed0038b3751d909441d4da5c1d2483540beb1638e94188510310004c17"} Jan 31 04:58:29 crc kubenswrapper[4812]: I0131 04:58:29.649934 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" event={"ID":"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d","Type":"ContainerStarted","Data":"7e7b4499dbecb901eabaf87e8bdf6ae852e67a08ad651cea64033507ce7daa9a"} Jan 31 04:58:29 crc kubenswrapper[4812]: I0131 04:58:29.665870 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" podStartSLOduration=1.665828087 podStartE2EDuration="1.665828087s" podCreationTimestamp="2026-01-31 04:58:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 04:58:29.66556031 +0000 UTC m=+1918.160581985" watchObservedRunningTime="2026-01-31 04:58:29.665828087 +0000 UTC m=+1918.160849762" Jan 31 04:58:43 crc kubenswrapper[4812]: I0131 04:58:43.405076 4812 scope.go:117] "RemoveContainer" containerID="d070d16b56cd0e6437b9a5ac7f3125f46083e750c6f5e1a070452d2f652d4e2e" Jan 31 04:59:08 crc 
kubenswrapper[4812]: I0131 04:59:08.485321 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.627752 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.628507 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.672312 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.836929 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/util/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.855586 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/extract/0.log" Jan 31 04:59:08 crc kubenswrapper[4812]: I0131 04:59:08.864304 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_920b3933541dd54eb27cdc8c5dcad58318a776ec0e7a3ec14a5289a926pg8ld_398de10e-71b7-41cb-ac1c-b2d5f8fffa75/pull/0.log" Jan 31 04:59:09 crc kubenswrapper[4812]: I0131 04:59:09.011596 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openstack-operators_horizon-operator-controller-manager-5d479c78f-kf6sb_500efc8e-b639-4788-833f-3cb9189e1009/manager/0.log" Jan 31 04:59:09 crc kubenswrapper[4812]: I0131 04:59:09.062743 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openstack-operators_horizon-operator-index-dggq6_bdfad5bb-7984-4248-9598-0319bb4543e0/registry-server/0.log" Jan 31 04:59:13 crc kubenswrapper[4812]: E0131 04:59:13.308907 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 04:59:13 crc kubenswrapper[4812]: E0131 04:59:13.309595 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:01:15.309560689 +0000 UTC m=+2083.804582364 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 04:59:13 crc kubenswrapper[4812]: E0131 04:59:13.308956 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 04:59:13 crc kubenswrapper[4812]: E0131 04:59:13.309703 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:01:15.309682802 +0000 UTC m=+2083.804704477 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 04:59:23 crc kubenswrapper[4812]: I0131 04:59:23.351585 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_control-plane-machine-set-operator-78cbb6b69f-qlf2j_e460d967-199b-41b2-a198-3acaaa1f4382/control-plane-machine-set-operator/0.log" Jan 31 04:59:23 crc kubenswrapper[4812]: I0131 04:59:23.549741 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rm7wz_e5c2893c-e678-4c5e-8692-8d50c2510ded/machine-api-operator/0.log" Jan 31 04:59:23 crc kubenswrapper[4812]: I0131 04:59:23.557984 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-machine-api_machine-api-operator-5694c8668f-rm7wz_e5c2893c-e678-4c5e-8692-8d50c2510ded/kube-rbac-proxy/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.581107 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-hldjr_b7380313-059d-4437-a3dc-371ce0a51fc3/kube-rbac-proxy/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.610096 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_controller-6968d8fdc4-hldjr_b7380313-059d-4437-a3dc-371ce0a51fc3/controller/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.751004 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.919754 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 
04:59:53.939449 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.946617 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:59:53 crc kubenswrapper[4812]: I0131 04:59:53.966575 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.087569 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.094324 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.098306 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.159868 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.305672 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-frr-files/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.317640 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-metrics/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.323916 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/cp-reloader/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.354038 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/controller/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.512920 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/kube-rbac-proxy/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.517680 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/kube-rbac-proxy-frr/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.518409 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/frr-metrics/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.687067 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/reloader/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.768637 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-webhook-server-7df86c4f6c-kctff_c8c1c067-45c3-4fc8-b656-920d058691ee/frr-k8s-webhook-server/0.log" Jan 31 04:59:54 crc kubenswrapper[4812]: I0131 04:59:54.881280 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-controller-manager-8458b697d8-qwvb2_3e6043c0-61f1-4cb1-a1df-056d81c22ea0/manager/0.log" Jan 31 04:59:55 crc kubenswrapper[4812]: I0131 04:59:55.045548 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_metallb-operator-webhook-server-56b4c6c8df-wjtg6_dfe9ef2d-1841-414a-9645-84d15e3fa9e5/webhook-server/0.log" Jan 31 04:59:55 crc kubenswrapper[4812]: I0131 04:59:55.085080 4812 log.go:25] 
"Finished parsing log file" path="/var/log/pods/metallb-system_frr-k8s-7qjt7_7d26d631-eae7-43cb-9df7-ef994fbb752d/frr/0.log" Jan 31 04:59:55 crc kubenswrapper[4812]: I0131 04:59:55.107032 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pnrw7_b0653050-2c1a-48f2-9f1b-10ccd0366143/kube-rbac-proxy/0.log" Jan 31 04:59:55 crc kubenswrapper[4812]: I0131 04:59:55.420606 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/metallb-system_speaker-pnrw7_b0653050-2c1a-48f2-9f1b-10ccd0366143/speaker/0.log" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.160963 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56"] Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.162626 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.167603 4812 reflector.go:368] Caches populated for *v1.ConfigMap from object-"openshift-operator-lifecycle-manager"/"collect-profiles-config" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.171007 4812 reflector.go:368] Caches populated for *v1.Secret from object-"openshift-operator-lifecycle-manager"/"collect-profiles-dockercfg-kzf4t" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.182869 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56"] Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.283090 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 
05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.283148 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.283244 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tm94c\" (UniqueName: \"kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.384541 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.384589 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.384664 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-tm94c\" (UniqueName: \"kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c\") pod 
\"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.385461 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.391285 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.404290 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-tm94c\" (UniqueName: \"kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c\") pod \"collect-profiles-29497260-qjn56\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.482769 4812 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:00 crc kubenswrapper[4812]: I0131 05:00:00.939240 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56"] Jan 31 05:00:01 crc kubenswrapper[4812]: I0131 05:00:01.272275 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" event={"ID":"4c83a1c4-bf1b-47d8-848c-f718522b9be0","Type":"ContainerStarted","Data":"dc8120ac35e08802e83e2daffa7a2e6735dec3697207afe899e956436f1caf1f"} Jan 31 05:00:01 crc kubenswrapper[4812]: I0131 05:00:01.272741 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" event={"ID":"4c83a1c4-bf1b-47d8-848c-f718522b9be0","Type":"ContainerStarted","Data":"f063d5be28db691e6348e04fc5a6e3512a80856fddb0e810769cfb0f633a603a"} Jan 31 05:00:01 crc kubenswrapper[4812]: I0131 05:00:01.296533 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" podStartSLOduration=1.296509887 podStartE2EDuration="1.296509887s" podCreationTimestamp="2026-01-31 05:00:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-01-31 05:00:01.292634323 +0000 UTC m=+2009.787656028" watchObservedRunningTime="2026-01-31 05:00:01.296509887 +0000 UTC m=+2009.791531592" Jan 31 05:00:02 crc kubenswrapper[4812]: I0131 05:00:02.291315 4812 generic.go:334] "Generic (PLEG): container finished" podID="4c83a1c4-bf1b-47d8-848c-f718522b9be0" containerID="dc8120ac35e08802e83e2daffa7a2e6735dec3697207afe899e956436f1caf1f" exitCode=0 Jan 31 05:00:02 crc kubenswrapper[4812]: I0131 05:00:02.291376 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" 
pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" event={"ID":"4c83a1c4-bf1b-47d8-848c-f718522b9be0","Type":"ContainerDied","Data":"dc8120ac35e08802e83e2daffa7a2e6735dec3697207afe899e956436f1caf1f"} Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.562376 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.631935 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume\") pod \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.632036 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume\") pod \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.632090 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tm94c\" (UniqueName: \"kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c\") pod \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\" (UID: \"4c83a1c4-bf1b-47d8-848c-f718522b9be0\") " Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.633071 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume" (OuterVolumeSpecName: "config-volume") pod "4c83a1c4-bf1b-47d8-848c-f718522b9be0" (UID: "4c83a1c4-bf1b-47d8-848c-f718522b9be0"). InnerVolumeSpecName "config-volume". 
PluginName "kubernetes.io/configmap", VolumeGidValue "" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.638991 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume" (OuterVolumeSpecName: "secret-volume") pod "4c83a1c4-bf1b-47d8-848c-f718522b9be0" (UID: "4c83a1c4-bf1b-47d8-848c-f718522b9be0"). InnerVolumeSpecName "secret-volume". PluginName "kubernetes.io/secret", VolumeGidValue "" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.639386 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c" (OuterVolumeSpecName: "kube-api-access-tm94c") pod "4c83a1c4-bf1b-47d8-848c-f718522b9be0" (UID: "4c83a1c4-bf1b-47d8-848c-f718522b9be0"). InnerVolumeSpecName "kube-api-access-tm94c". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.733277 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-tm94c\" (UniqueName: \"kubernetes.io/projected/4c83a1c4-bf1b-47d8-848c-f718522b9be0-kube-api-access-tm94c\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.733327 4812 reconciler_common.go:293] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/4c83a1c4-bf1b-47d8-848c-f718522b9be0-config-volume\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:03 crc kubenswrapper[4812]: I0131 05:00:03.733342 4812 reconciler_common.go:293] "Volume detached for volume \"secret-volume\" (UniqueName: \"kubernetes.io/secret/4c83a1c4-bf1b-47d8-848c-f718522b9be0-secret-volume\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:04 crc kubenswrapper[4812]: I0131 05:00:04.309197 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" 
event={"ID":"4c83a1c4-bf1b-47d8-848c-f718522b9be0","Type":"ContainerDied","Data":"f063d5be28db691e6348e04fc5a6e3512a80856fddb0e810769cfb0f633a603a"} Jan 31 05:00:04 crc kubenswrapper[4812]: I0131 05:00:04.309595 4812 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="f063d5be28db691e6348e04fc5a6e3512a80856fddb0e810769cfb0f633a603a" Jan 31 05:00:04 crc kubenswrapper[4812]: I0131 05:00:04.309297 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-operator-lifecycle-manager/collect-profiles-29497260-qjn56" Jan 31 05:00:04 crc kubenswrapper[4812]: I0131 05:00:04.398492 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb"] Jan 31 05:00:04 crc kubenswrapper[4812]: I0131 05:00:04.407179 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-operator-lifecycle-manager/collect-profiles-29497215-pf4tb"] Jan 31 05:00:06 crc kubenswrapper[4812]: I0131 05:00:06.349020 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1d5723db-1696-4fe1-a736-756e9bf39115" path="/var/lib/kubelet/pods/1d5723db-1696-4fe1-a736-756e9bf39115/volumes" Jan 31 05:00:10 crc kubenswrapper[4812]: I0131 05:00:10.002662 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/glance-kuttl-tests_openstackclient_bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62/openstackclient/0.log" Jan 31 05:00:14 crc kubenswrapper[4812]: I0131 05:00:14.338123 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 05:00:14 crc kubenswrapper[4812]: I0131 05:00:14.338986 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" 
podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.348060 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.443496 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.468703 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.504392 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.678918 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/util/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.693208 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/extract/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.697924 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_270996307cd21d144be796860235064b5127c2fcf62ccccd6689c259dc2xsk6_3d8d899b-ae53-4c83-b9fc-5a124d9a5a5a/pull/0.log" Jan 31 05:00:24 crc kubenswrapper[4812]: I0131 05:00:24.881895 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.008247 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.014023 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.046566 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.187874 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.220881 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/extract-utilities/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.457678 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.571351 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.579260 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_certified-operators-r58bf_1e130501-5caf-49a3-bd51-61ecde347414/registry-server/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.624929 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.694110 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.850849 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-content/0.log" Jan 31 05:00:25 crc kubenswrapper[4812]: I0131 05:00:25.863455 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.086973 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_marketplace-operator-79b997595-rcfw5_55175f00-9682-4c72-a26a-3b050c99af46/marketplace-operator/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.165026 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.182469 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-marketplace_community-operators-jxdsd_567d2603-f5cd-4e06-a78d-d0ad581f7d3f/registry-server/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.319500 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.327682 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.337916 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.474499 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.495919 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/extract-content/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.547158 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:26 crc kubenswrapper[4812]: E0131 05:00:26.547385 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="4c83a1c4-bf1b-47d8-848c-f718522b9be0" containerName="collect-profiles" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.547398 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="4c83a1c4-bf1b-47d8-848c-f718522b9be0" containerName="collect-profiles" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.547502 4812 memory_manager.go:354] "RemoveStaleState removing state" 
podUID="4c83a1c4-bf1b-47d8-848c-f718522b9be0" containerName="collect-profiles" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.548257 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.560410 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.577664 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-marketplace-km6lg_663ce6b5-46cb-45ca-9e5c-9ef13d78189f/registry-server/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.677968 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.687365 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.687419 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9qpq\" (UniqueName: \"kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.687646 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.789217 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.789275 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-v9qpq\" (UniqueName: \"kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.789363 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.789790 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.789813 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: 
\"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.809678 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9qpq\" (UniqueName: \"kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq\") pod \"redhat-operators-bkx6n\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.858312 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.866912 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.890583 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 31 05:00:26 crc kubenswrapper[4812]: I0131 05:00:26.916700 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.081106 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.146180 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-utilities/0.log" Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.186925 4812 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/extract-content/0.log" Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.406709 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-marketplace_redhat-operators-9l62r_15084025-2b02-454c-9b65-e2e943d80e39/registry-server/0.log" Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.456109 4812 generic.go:334] "Generic (PLEG): container finished" podID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerID="4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622" exitCode=0 Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.456346 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerDied","Data":"4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622"} Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.456725 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerStarted","Data":"bd080be4acb39ac026f9fc2046cec6acd2a2f1cbfeddca95525f04141385664a"} Jan 31 05:00:27 crc kubenswrapper[4812]: I0131 05:00:27.457614 4812 provider.go:102] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Jan 31 05:00:28 crc kubenswrapper[4812]: I0131 05:00:28.463737 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerStarted","Data":"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889"} Jan 31 05:00:29 crc kubenswrapper[4812]: I0131 05:00:29.471347 4812 generic.go:334] "Generic (PLEG): container finished" podID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerID="fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889" exitCode=0 
Jan 31 05:00:29 crc kubenswrapper[4812]: I0131 05:00:29.471402 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerDied","Data":"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889"} Jan 31 05:00:30 crc kubenswrapper[4812]: I0131 05:00:30.478565 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerStarted","Data":"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90"} Jan 31 05:00:36 crc kubenswrapper[4812]: I0131 05:00:36.867620 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:36 crc kubenswrapper[4812]: I0131 05:00:36.868344 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:37 crc kubenswrapper[4812]: I0131 05:00:37.928885 4812 prober.go:107] "Probe failed" probeType="Startup" pod="openshift-marketplace/redhat-operators-bkx6n" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="registry-server" probeResult="failure" output=< Jan 31 05:00:37 crc kubenswrapper[4812]: timeout: failed to connect service ":50051" within 1s Jan 31 05:00:37 crc kubenswrapper[4812]: > Jan 31 05:00:43 crc kubenswrapper[4812]: I0131 05:00:43.486120 4812 scope.go:117] "RemoveContainer" containerID="5253eb5a05fb0be7a9c86b4254bcd7ed80c9c6440c7c6aed5f218e666951e399" Jan 31 05:00:44 crc kubenswrapper[4812]: I0131 05:00:44.337952 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 05:00:44 crc 
kubenswrapper[4812]: I0131 05:00:44.338254 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 05:00:46 crc kubenswrapper[4812]: I0131 05:00:46.938591 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:46 crc kubenswrapper[4812]: I0131 05:00:46.960554 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/redhat-operators-bkx6n" podStartSLOduration=18.508350179 podStartE2EDuration="20.960529664s" podCreationTimestamp="2026-01-31 05:00:26 +0000 UTC" firstStartedPulling="2026-01-31 05:00:27.457408021 +0000 UTC m=+2035.952429686" lastFinishedPulling="2026-01-31 05:00:29.909587476 +0000 UTC m=+2038.404609171" observedRunningTime="2026-01-31 05:00:30.499983948 +0000 UTC m=+2038.995005603" watchObservedRunningTime="2026-01-31 05:00:46.960529664 +0000 UTC m=+2055.455551359" Jan 31 05:00:47 crc kubenswrapper[4812]: I0131 05:00:47.007172 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:47 crc kubenswrapper[4812]: I0131 05:00:47.186368 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:48 crc kubenswrapper[4812]: I0131 05:00:48.617074 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/redhat-operators-bkx6n" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="registry-server" containerID="cri-o://e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90" gracePeriod=2 Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.023667 4812 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.123824 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content\") pod \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.123892 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities\") pod \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.123940 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9qpq\" (UniqueName: \"kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq\") pod \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\" (UID: \"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c\") " Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.125559 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities" (OuterVolumeSpecName: "utilities") pod "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" (UID: "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.131985 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq" (OuterVolumeSpecName: "kube-api-access-v9qpq") pod "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" (UID: "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c"). 
InnerVolumeSpecName "kube-api-access-v9qpq". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.226057 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.226106 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-v9qpq\" (UniqueName: \"kubernetes.io/projected/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-kube-api-access-v9qpq\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.262763 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" (UID: "d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.327258 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.627923 4812 generic.go:334] "Generic (PLEG): container finished" podID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerID="e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90" exitCode=0 Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.627980 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/redhat-operators-bkx6n" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.627983 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerDied","Data":"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90"} Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.628132 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/redhat-operators-bkx6n" event={"ID":"d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c","Type":"ContainerDied","Data":"bd080be4acb39ac026f9fc2046cec6acd2a2f1cbfeddca95525f04141385664a"} Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.628165 4812 scope.go:117] "RemoveContainer" containerID="e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.652023 4812 scope.go:117] "RemoveContainer" containerID="fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.686965 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.692050 4812 scope.go:117] "RemoveContainer" containerID="4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.696206 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/redhat-operators-bkx6n"] Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.721797 4812 scope.go:117] "RemoveContainer" containerID="e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90" Jan 31 05:00:49 crc kubenswrapper[4812]: E0131 05:00:49.730405 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90\": container with ID starting with e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90 not found: ID does not exist" containerID="e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.730455 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90"} err="failed to get container status \"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90\": rpc error: code = NotFound desc = could not find container \"e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90\": container with ID starting with e97ff36e726300531ab9e544c40ae62d1bcb3fd1abfbfc63da55c6c7f2126b90 not found: ID does not exist" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.730483 4812 scope.go:117] "RemoveContainer" containerID="fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889" Jan 31 05:00:49 crc kubenswrapper[4812]: E0131 05:00:49.731538 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889\": container with ID starting with fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889 not found: ID does not exist" containerID="fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.731593 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889"} err="failed to get container status \"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889\": rpc error: code = NotFound desc = could not find container \"fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889\": container with ID 
starting with fbc75c945d8c0a0e71e266ac41abeb203cdbd88052e0aaff1b5ef6504c368889 not found: ID does not exist" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.731631 4812 scope.go:117] "RemoveContainer" containerID="4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622" Jan 31 05:00:49 crc kubenswrapper[4812]: E0131 05:00:49.732273 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622\": container with ID starting with 4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622 not found: ID does not exist" containerID="4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622" Jan 31 05:00:49 crc kubenswrapper[4812]: I0131 05:00:49.732312 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622"} err="failed to get container status \"4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622\": rpc error: code = NotFound desc = could not find container \"4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622\": container with ID starting with 4514ad489383eb2ca3d44c06e72bf7ee4e48488c301229c58e6f94b4b283e622 not found: ID does not exist" Jan 31 05:00:50 crc kubenswrapper[4812]: I0131 05:00:50.352125 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" path="/var/lib/kubelet/pods/d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c/volumes" Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.338582 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body= Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 
05:01:14.339416 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.352819 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.354037 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted" Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.354158 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee" gracePeriod=600 Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.904522 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee" exitCode=0 Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.905508 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee"} Jan 31 05:01:14 crc 
kubenswrapper[4812]: I0131 05:01:14.905593 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerStarted","Data":"e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038"} Jan 31 05:01:14 crc kubenswrapper[4812]: I0131 05:01:14.905636 4812 scope.go:117] "RemoveContainer" containerID="1477d63f031eb054b3c1b14f223413b258ed874b494cb407766f3fc2fb3786b3" Jan 31 05:01:15 crc kubenswrapper[4812]: E0131 05:01:15.379464 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found Jan 31 05:01:15 crc kubenswrapper[4812]: E0131 05:01:15.379566 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:03:17.379541071 +0000 UTC m=+2205.874562776 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found Jan 31 05:01:15 crc kubenswrapper[4812]: E0131 05:01:15.380145 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found Jan 31 05:01:15 crc kubenswrapper[4812]: E0131 05:01:15.380244 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:03:17.380224569 +0000 UTC m=+2205.875246264 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found Jan 31 05:01:37 crc kubenswrapper[4812]: I0131 05:01:37.078762 4812 generic.go:334] "Generic (PLEG): container finished" podID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerID="3c4a6eed0038b3751d909441d4da5c1d2483540beb1638e94188510310004c17" exitCode=0 Jan 31 05:01:37 crc kubenswrapper[4812]: I0131 05:01:37.078858 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" event={"ID":"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d","Type":"ContainerDied","Data":"3c4a6eed0038b3751d909441d4da5c1d2483540beb1638e94188510310004c17"} Jan 31 05:01:37 crc kubenswrapper[4812]: I0131 05:01:37.079754 4812 scope.go:117] "RemoveContainer" containerID="3c4a6eed0038b3751d909441d4da5c1d2483540beb1638e94188510310004c17" Jan 31 05:01:37 crc kubenswrapper[4812]: I0131 05:01:37.964919 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk9tl_must-gather-tmwtc_5cebb6d6-c967-4ce1-8737-89b1c7d6af9d/gather/0.log" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.454733 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:01:46 crc kubenswrapper[4812]: E0131 05:01:46.455807 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="extract-utilities" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.455825 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="extract-utilities" Jan 31 05:01:46 crc kubenswrapper[4812]: E0131 05:01:46.455872 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" 
containerName="registry-server" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.455886 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="registry-server" Jan 31 05:01:46 crc kubenswrapper[4812]: E0131 05:01:46.455911 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="extract-content" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.455922 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="extract-content" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.456089 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="d8a055be-4dea-49cd-b93c-4c2fc8ff7d6c" containerName="registry-server" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.457201 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.475078 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.476318 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.476387 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " 
pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.476668 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxnz4\" (UniqueName: \"kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.577739 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.577794 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.577920 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"kube-api-access-hxnz4\" (UniqueName: \"kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.578395 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " 
pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.578586 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.605321 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxnz4\" (UniqueName: \"kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4\") pod \"certified-operators-52hb9\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.782944 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.931934 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-sk9tl/must-gather-tmwtc"] Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.932457 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="copy" containerID="cri-o://65d4ae136ea66a0d2cc3e579fbbae81a420a28e00e79073a4b6b2a47d4d8f232" gracePeriod=2 Jan 31 05:01:46 crc kubenswrapper[4812]: I0131 05:01:46.939417 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-sk9tl/must-gather-tmwtc"] Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:47.158452 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk9tl_must-gather-tmwtc_5cebb6d6-c967-4ce1-8737-89b1c7d6af9d/copy/0.log" Jan 31 05:01:48 crc kubenswrapper[4812]: 
I0131 05:01:47.159163 4812 generic.go:334] "Generic (PLEG): container finished" podID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerID="65d4ae136ea66a0d2cc3e579fbbae81a420a28e00e79073a4b6b2a47d4d8f232" exitCode=143 Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:47.217881 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:01:48 crc kubenswrapper[4812]: W0131 05:01:47.222822 4812 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0951508e_438c_48ee_b03a_b579b3592736.slice/crio-cdad746e55977525bb1d4467ec6b68e2d2da9406a62451de3752944824517878 WatchSource:0}: Error finding container cdad746e55977525bb1d4467ec6b68e2d2da9406a62451de3752944824517878: Status 404 returned error can't find the container with id cdad746e55977525bb1d4467ec6b68e2d2da9406a62451de3752944824517878 Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.171148 4812 generic.go:334] "Generic (PLEG): container finished" podID="0951508e-438c-48ee-b03a-b579b3592736" containerID="b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88" exitCode=0 Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.171248 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerDied","Data":"b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88"} Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.171467 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerStarted","Data":"cdad746e55977525bb1d4467ec6b68e2d2da9406a62451de3752944824517878"} Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.627372 4812 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-must-gather-sk9tl_must-gather-tmwtc_5cebb6d6-c967-4ce1-8737-89b1c7d6af9d/copy/0.log" Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.628027 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.807519 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output\") pod \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.807643 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-45rbl\" (UniqueName: \"kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl\") pod \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\" (UID: \"5cebb6d6-c967-4ce1-8737-89b1c7d6af9d\") " Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.817172 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl" (OuterVolumeSpecName: "kube-api-access-45rbl") pod "5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" (UID: "5cebb6d6-c967-4ce1-8737-89b1c7d6af9d"). InnerVolumeSpecName "kube-api-access-45rbl". PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.879298 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" (UID: "5cebb6d6-c967-4ce1-8737-89b1c7d6af9d"). InnerVolumeSpecName "must-gather-output". 
PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.909071 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-45rbl\" (UniqueName: \"kubernetes.io/projected/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-kube-api-access-45rbl\") on node \"crc\" DevicePath \"\"" Jan 31 05:01:48 crc kubenswrapper[4812]: I0131 05:01:48.909113 4812 reconciler_common.go:293] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d-must-gather-output\") on node \"crc\" DevicePath \"\"" Jan 31 05:01:49 crc kubenswrapper[4812]: I0131 05:01:49.181091 4812 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-sk9tl_must-gather-tmwtc_5cebb6d6-c967-4ce1-8737-89b1c7d6af9d/copy/0.log" Jan 31 05:01:49 crc kubenswrapper[4812]: I0131 05:01:49.181756 4812 scope.go:117] "RemoveContainer" containerID="65d4ae136ea66a0d2cc3e579fbbae81a420a28e00e79073a4b6b2a47d4d8f232" Jan 31 05:01:49 crc kubenswrapper[4812]: I0131 05:01:49.181773 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-sk9tl/must-gather-tmwtc" Jan 31 05:01:49 crc kubenswrapper[4812]: I0131 05:01:49.185353 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerStarted","Data":"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d"} Jan 31 05:01:49 crc kubenswrapper[4812]: I0131 05:01:49.203893 4812 scope.go:117] "RemoveContainer" containerID="3c4a6eed0038b3751d909441d4da5c1d2483540beb1638e94188510310004c17" Jan 31 05:01:50 crc kubenswrapper[4812]: I0131 05:01:50.197993 4812 generic.go:334] "Generic (PLEG): container finished" podID="0951508e-438c-48ee-b03a-b579b3592736" containerID="4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d" exitCode=0 Jan 31 05:01:50 crc kubenswrapper[4812]: I0131 05:01:50.198096 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerDied","Data":"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d"} Jan 31 05:01:50 crc kubenswrapper[4812]: I0131 05:01:50.348688 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" path="/var/lib/kubelet/pods/5cebb6d6-c967-4ce1-8737-89b1c7d6af9d/volumes" Jan 31 05:01:51 crc kubenswrapper[4812]: I0131 05:01:51.210640 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerStarted","Data":"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb"} Jan 31 05:01:51 crc kubenswrapper[4812]: I0131 05:01:51.245362 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/certified-operators-52hb9" podStartSLOduration=2.772632148 podStartE2EDuration="5.245329425s" 
podCreationTimestamp="2026-01-31 05:01:46 +0000 UTC" firstStartedPulling="2026-01-31 05:01:48.17340427 +0000 UTC m=+2116.668425935" lastFinishedPulling="2026-01-31 05:01:50.646101507 +0000 UTC m=+2119.141123212" observedRunningTime="2026-01-31 05:01:51.239401436 +0000 UTC m=+2119.734423151" watchObservedRunningTime="2026-01-31 05:01:51.245329425 +0000 UTC m=+2119.740351130" Jan 31 05:01:56 crc kubenswrapper[4812]: I0131 05:01:56.783270 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:56 crc kubenswrapper[4812]: I0131 05:01:56.784218 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:56 crc kubenswrapper[4812]: I0131 05:01:56.828473 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:57 crc kubenswrapper[4812]: I0131 05:01:57.302496 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:57 crc kubenswrapper[4812]: I0131 05:01:57.354026 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.277501 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/certified-operators-52hb9" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="registry-server" containerID="cri-o://6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb" gracePeriod=2 Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.825513 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.979664 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content\") pod \"0951508e-438c-48ee-b03a-b579b3592736\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.979799 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxnz4\" (UniqueName: \"kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4\") pod \"0951508e-438c-48ee-b03a-b579b3592736\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.979934 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities\") pod \"0951508e-438c-48ee-b03a-b579b3592736\" (UID: \"0951508e-438c-48ee-b03a-b579b3592736\") " Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.982266 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities" (OuterVolumeSpecName: "utilities") pod "0951508e-438c-48ee-b03a-b579b3592736" (UID: "0951508e-438c-48ee-b03a-b579b3592736"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 05:01:59 crc kubenswrapper[4812]: I0131 05:01:59.988462 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4" (OuterVolumeSpecName: "kube-api-access-hxnz4") pod "0951508e-438c-48ee-b03a-b579b3592736" (UID: "0951508e-438c-48ee-b03a-b579b3592736"). InnerVolumeSpecName "kube-api-access-hxnz4". 
PluginName "kubernetes.io/projected", VolumeGidValue "" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.082287 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-hxnz4\" (UniqueName: \"kubernetes.io/projected/0951508e-438c-48ee-b03a-b579b3592736-kube-api-access-hxnz4\") on node \"crc\" DevicePath \"\"" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.082336 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-utilities\") on node \"crc\" DevicePath \"\"" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.288902 4812 generic.go:334] "Generic (PLEG): container finished" podID="0951508e-438c-48ee-b03a-b579b3592736" containerID="6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb" exitCode=0 Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.288979 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerDied","Data":"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb"} Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.289037 4812 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-marketplace/certified-operators-52hb9" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.289066 4812 scope.go:117] "RemoveContainer" containerID="6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.289039 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/certified-operators-52hb9" event={"ID":"0951508e-438c-48ee-b03a-b579b3592736","Type":"ContainerDied","Data":"cdad746e55977525bb1d4467ec6b68e2d2da9406a62451de3752944824517878"} Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.289248 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "0951508e-438c-48ee-b03a-b579b3592736" (UID: "0951508e-438c-48ee-b03a-b579b3592736"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue "" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.319994 4812 scope.go:117] "RemoveContainer" containerID="4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.335983 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.345685 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/certified-operators-52hb9"] Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.355944 4812 scope.go:117] "RemoveContainer" containerID="b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.358830 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0951508e-438c-48ee-b03a-b579b3592736" path="/var/lib/kubelet/pods/0951508e-438c-48ee-b03a-b579b3592736/volumes" Jan 31 05:02:00 crc 
kubenswrapper[4812]: I0131 05:02:00.379495 4812 scope.go:117] "RemoveContainer" containerID="6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb" Jan 31 05:02:00 crc kubenswrapper[4812]: E0131 05:02:00.379884 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb\": container with ID starting with 6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb not found: ID does not exist" containerID="6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.379947 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb"} err="failed to get container status \"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb\": rpc error: code = NotFound desc = could not find container \"6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb\": container with ID starting with 6a98766670fab7b6a3619018f93b02dd6c9882314ec9324d7e2a63e039cb6ffb not found: ID does not exist" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.379993 4812 scope.go:117] "RemoveContainer" containerID="4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d" Jan 31 05:02:00 crc kubenswrapper[4812]: E0131 05:02:00.380452 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d\": container with ID starting with 4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d not found: ID does not exist" containerID="4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.380488 4812 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d"} err="failed to get container status \"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d\": rpc error: code = NotFound desc = could not find container \"4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d\": container with ID starting with 4c7fa3afaf6177b55ca8b083f6a102c7a8810e1bd408ce7bee84eab4d514ad9d not found: ID does not exist" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.380515 4812 scope.go:117] "RemoveContainer" containerID="b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88" Jan 31 05:02:00 crc kubenswrapper[4812]: E0131 05:02:00.381195 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88\": container with ID starting with b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88 not found: ID does not exist" containerID="b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.381245 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88"} err="failed to get container status \"b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88\": rpc error: code = NotFound desc = could not find container \"b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88\": container with ID starting with b9f09b30b327988c69d4b2358a11dd5bc4d9bdde5ce946f58fbe1adf740abf88 not found: ID does not exist" Jan 31 05:02:00 crc kubenswrapper[4812]: I0131 05:02:00.386350 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/0951508e-438c-48ee-b03a-b579b3592736-catalog-content\") on node \"crc\" DevicePath \"\"" Jan 31 05:02:42 crc 
kubenswrapper[4812]: I0131 05:02:42.139497 4812 kubelet.go:2421] "SyncLoop ADD" source="api" pods=["openshift-marketplace/community-operators-brx6p"] Jan 31 05:02:42 crc kubenswrapper[4812]: E0131 05:02:42.140379 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="gather" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140400 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="gather" Jan 31 05:02:42 crc kubenswrapper[4812]: E0131 05:02:42.140416 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="extract-utilities" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140429 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="extract-utilities" Jan 31 05:02:42 crc kubenswrapper[4812]: E0131 05:02:42.140447 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="registry-server" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140460 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="registry-server" Jan 31 05:02:42 crc kubenswrapper[4812]: E0131 05:02:42.140490 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="copy" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140502 4812 state_mem.go:107] "Deleted CPUSet assignment" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="copy" Jan 31 05:02:42 crc kubenswrapper[4812]: E0131 05:02:42.140526 4812 cpu_manager.go:410] "RemoveStaleState: removing container" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="extract-content" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140539 4812 state_mem.go:107] 
"Deleted CPUSet assignment" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="extract-content" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140716 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="0951508e-438c-48ee-b03a-b579b3592736" containerName="registry-server" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140741 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="gather" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.140757 4812 memory_manager.go:354] "RemoveStaleState removing state" podUID="5cebb6d6-c967-4ce1-8737-89b1c7d6af9d" containerName="copy" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.142043 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.160254 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brx6p"] Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.255857 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5tztg\" (UniqueName: \"kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.255946 4812 reconciler_common.go:245] "operationExecutor.VerifyControllerAttachedVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.255968 4812 reconciler_common.go:245] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.357504 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.358184 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.358238 4812 reconciler_common.go:218] "operationExecutor.MountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.358482 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.358551 4812 reconciler_common.go:218] "operationExecutor.MountVolume 
started for volume \"kube-api-access-5tztg\" (UniqueName: \"kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.384704 4812 operation_generator.go:637] "MountVolume.SetUp succeeded for volume \"kube-api-access-5tztg\" (UniqueName: \"kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg\") pod \"community-operators-brx6p\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") " pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.514033 4812 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:42 crc kubenswrapper[4812]: I0131 05:02:42.766419 4812 kubelet.go:2428] "SyncLoop UPDATE" source="api" pods=["openshift-marketplace/community-operators-brx6p"] Jan 31 05:02:43 crc kubenswrapper[4812]: I0131 05:02:43.603373 4812 generic.go:334] "Generic (PLEG): container finished" podID="caab77e6-151d-4c54-ac52-6698f4a714b7" containerID="92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43" exitCode=0 Jan 31 05:02:43 crc kubenswrapper[4812]: I0131 05:02:43.603419 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerDied","Data":"92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43"} Jan 31 05:02:43 crc kubenswrapper[4812]: I0131 05:02:43.603607 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerStarted","Data":"13426618dc9fd56788d8860909163252d0d95a4bc42bf73df901f4617065914e"} Jan 31 05:02:44 crc kubenswrapper[4812]: I0131 
05:02:44.621171 4812 generic.go:334] "Generic (PLEG): container finished" podID="caab77e6-151d-4c54-ac52-6698f4a714b7" containerID="69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904" exitCode=0 Jan 31 05:02:44 crc kubenswrapper[4812]: I0131 05:02:44.621389 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerDied","Data":"69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904"} Jan 31 05:02:45 crc kubenswrapper[4812]: I0131 05:02:45.632925 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerStarted","Data":"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"} Jan 31 05:02:45 crc kubenswrapper[4812]: I0131 05:02:45.668216 4812 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-marketplace/community-operators-brx6p" podStartSLOduration=2.157283455 podStartE2EDuration="3.668177436s" podCreationTimestamp="2026-01-31 05:02:42 +0000 UTC" firstStartedPulling="2026-01-31 05:02:43.607568341 +0000 UTC m=+2172.102590006" lastFinishedPulling="2026-01-31 05:02:45.118462282 +0000 UTC m=+2173.613483987" observedRunningTime="2026-01-31 05:02:45.654143939 +0000 UTC m=+2174.149165654" watchObservedRunningTime="2026-01-31 05:02:45.668177436 +0000 UTC m=+2174.163199151" Jan 31 05:02:52 crc kubenswrapper[4812]: I0131 05:02:52.514590 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:52 crc kubenswrapper[4812]: I0131 05:02:52.515356 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:52 crc kubenswrapper[4812]: I0131 05:02:52.591897 4812 kubelet.go:2542] "SyncLoop (probe)" probe="startup" 
status="started" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:52 crc kubenswrapper[4812]: I0131 05:02:52.760273 4812 kubelet.go:2542] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:52 crc kubenswrapper[4812]: I0131 05:02:52.838831 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brx6p"] Jan 31 05:02:54 crc kubenswrapper[4812]: I0131 05:02:54.698164 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-marketplace/community-operators-brx6p" podUID="caab77e6-151d-4c54-ac52-6698f4a714b7" containerName="registry-server" containerID="cri-o://7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1" gracePeriod=2 Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.647098 4812 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brx6p" Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.710092 4812 generic.go:334] "Generic (PLEG): container finished" podID="caab77e6-151d-4c54-ac52-6698f4a714b7" containerID="7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1" exitCode=0 Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.710136 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerDied","Data":"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"} Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.710196 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-marketplace/community-operators-brx6p" event={"ID":"caab77e6-151d-4c54-ac52-6698f4a714b7","Type":"ContainerDied","Data":"13426618dc9fd56788d8860909163252d0d95a4bc42bf73df901f4617065914e"} Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.710202 4812 util.go:48] "No ready 
sandbox for pod can be found. Need to start a new one" pod="openshift-marketplace/community-operators-brx6p"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.710219 4812 scope.go:117] "RemoveContainer" containerID="7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.739579 4812 scope.go:117] "RemoveContainer" containerID="69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.763025 4812 scope.go:117] "RemoveContainer" containerID="92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.769093 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content\") pod \"caab77e6-151d-4c54-ac52-6698f4a714b7\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") "
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.769250 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities\") pod \"caab77e6-151d-4c54-ac52-6698f4a714b7\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") "
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.769387 4812 reconciler_common.go:159] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5tztg\" (UniqueName: \"kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg\") pod \"caab77e6-151d-4c54-ac52-6698f4a714b7\" (UID: \"caab77e6-151d-4c54-ac52-6698f4a714b7\") "
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.770950 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities" (OuterVolumeSpecName: "utilities") pod "caab77e6-151d-4c54-ac52-6698f4a714b7" (UID: "caab77e6-151d-4c54-ac52-6698f4a714b7"). InnerVolumeSpecName "utilities". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.779612 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg" (OuterVolumeSpecName: "kube-api-access-5tztg") pod "caab77e6-151d-4c54-ac52-6698f4a714b7" (UID: "caab77e6-151d-4c54-ac52-6698f4a714b7"). InnerVolumeSpecName "kube-api-access-5tztg". PluginName "kubernetes.io/projected", VolumeGidValue ""
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.801009 4812 scope.go:117] "RemoveContainer" containerID="7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"
Jan 31 05:02:55 crc kubenswrapper[4812]: E0131 05:02:55.801476 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1\": container with ID starting with 7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1 not found: ID does not exist" containerID="7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.801530 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1"} err="failed to get container status \"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1\": rpc error: code = NotFound desc = could not find container \"7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1\": container with ID starting with 7c04b4cb0a273d779c32651aaea56fd16e279b73c290022563039889ca14d7a1 not found: ID does not exist"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.801562 4812 scope.go:117] "RemoveContainer" containerID="69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904"
Jan 31 05:02:55 crc kubenswrapper[4812]: E0131 05:02:55.801933 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904\": container with ID starting with 69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904 not found: ID does not exist" containerID="69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.801995 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904"} err="failed to get container status \"69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904\": rpc error: code = NotFound desc = could not find container \"69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904\": container with ID starting with 69cc0a92aac6ffbf800668a7be7b275989dd9ff5fbea8d60b6f910a7fead4904 not found: ID does not exist"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.802054 4812 scope.go:117] "RemoveContainer" containerID="92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43"
Jan 31 05:02:55 crc kubenswrapper[4812]: E0131 05:02:55.802342 4812 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43\": container with ID starting with 92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43 not found: ID does not exist" containerID="92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.802410 4812 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43"} err="failed to get container status \"92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43\": rpc error: code = NotFound desc = could not find container \"92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43\": container with ID starting with 92c95014e0355a4f4faa4b4f98db5b4ebf8ddaead9090d168647e2510f0c3c43 not found: ID does not exist"
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.833914 4812 operation_generator.go:803] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content" (OuterVolumeSpecName: "catalog-content") pod "caab77e6-151d-4c54-ac52-6698f4a714b7" (UID: "caab77e6-151d-4c54-ac52-6698f4a714b7"). InnerVolumeSpecName "catalog-content". PluginName "kubernetes.io/empty-dir", VolumeGidValue ""
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.871311 4812 reconciler_common.go:293] "Volume detached for volume \"kube-api-access-5tztg\" (UniqueName: \"kubernetes.io/projected/caab77e6-151d-4c54-ac52-6698f4a714b7-kube-api-access-5tztg\") on node \"crc\" DevicePath \"\""
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.871334 4812 reconciler_common.go:293] "Volume detached for volume \"catalog-content\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-catalog-content\") on node \"crc\" DevicePath \"\""
Jan 31 05:02:55 crc kubenswrapper[4812]: I0131 05:02:55.871345 4812 reconciler_common.go:293] "Volume detached for volume \"utilities\" (UniqueName: \"kubernetes.io/empty-dir/caab77e6-151d-4c54-ac52-6698f4a714b7-utilities\") on node \"crc\" DevicePath \"\""
Jan 31 05:02:56 crc kubenswrapper[4812]: I0131 05:02:56.060341 4812 kubelet.go:2437] "SyncLoop DELETE" source="api" pods=["openshift-marketplace/community-operators-brx6p"]
Jan 31 05:02:56 crc kubenswrapper[4812]: I0131 05:02:56.069310 4812 kubelet.go:2431] "SyncLoop REMOVE" source="api" pods=["openshift-marketplace/community-operators-brx6p"]
Jan 31 05:02:56 crc kubenswrapper[4812]: I0131 05:02:56.352553 4812 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="caab77e6-151d-4c54-ac52-6698f4a714b7" path="/var/lib/kubelet/pods/caab77e6-151d-4c54-ac52-6698f4a714b7/volumes"
Jan 31 05:03:14 crc kubenswrapper[4812]: I0131 05:03:14.337999 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 05:03:14 crc kubenswrapper[4812]: I0131 05:03:14.338579 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 05:03:17 crc kubenswrapper[4812]: E0131 05:03:17.405894 4812 configmap.go:193] Couldn't get configMap glance-kuttl-tests/openstack-config: configmap "openstack-config" not found
Jan 31 05:03:17 crc kubenswrapper[4812]: E0131 05:03:17.406290 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:05:19.406263023 +0000 UTC m=+2327.901284718 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config" (UniqueName: "kubernetes.io/configmap/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : configmap "openstack-config" not found
Jan 31 05:03:17 crc kubenswrapper[4812]: E0131 05:03:17.405941 4812 secret.go:188] Couldn't get secret glance-kuttl-tests/openstack-config-secret: secret "openstack-config-secret" not found
Jan 31 05:03:17 crc kubenswrapper[4812]: E0131 05:03:17.406408 4812 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret podName:bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62 nodeName:}" failed. No retries permitted until 2026-01-31 05:05:19.406387517 +0000 UTC m=+2327.901409212 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "openstack-config-secret" (UniqueName: "kubernetes.io/secret/bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62-openstack-config-secret") pod "openstackclient" (UID: "bf0f9c46-9e7f-4e5e-832a-5c07ed5e3d62") : secret "openstack-config-secret" not found
Jan 31 05:03:44 crc kubenswrapper[4812]: I0131 05:03:44.338760 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 05:03:44 crc kubenswrapper[4812]: I0131 05:03:44.340119 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 05:04:14 crc kubenswrapper[4812]: I0131 05:04:14.338267 4812 patch_prober.go:28] interesting pod/machine-config-daemon-lx2wb container/machine-config-daemon namespace/openshift-machine-config-operator: Liveness probe status=failure output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused" start-of-body=
Jan 31 05:04:14 crc kubenswrapper[4812]: I0131 05:04:14.338952 4812 prober.go:107] "Probe failed" probeType="Liveness" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" probeResult="failure" output="Get \"http://127.0.0.1:8798/health\": dial tcp 127.0.0.1:8798: connect: connection refused"
Jan 31 05:04:14 crc kubenswrapper[4812]: I0131 05:04:14.339017 4812 kubelet.go:2542] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb"
Jan 31 05:04:14 crc kubenswrapper[4812]: I0131 05:04:14.340364 4812 kuberuntime_manager.go:1027] "Message for Container of pod" containerName="machine-config-daemon" containerStatusID={"Type":"cri-o","ID":"e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038"} pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" containerMessage="Container machine-config-daemon failed liveness probe, will be restarted"
Jan 31 05:04:14 crc kubenswrapper[4812]: I0131 05:04:14.340449 4812 kuberuntime_container.go:808] "Killing container with a grace period" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerName="machine-config-daemon" containerID="cri-o://e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038" gracePeriod=600
Jan 31 05:04:14 crc kubenswrapper[4812]: E0131 05:04:14.472796 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6"
Jan 31 05:04:15 crc kubenswrapper[4812]: I0131 05:04:15.396158 4812 generic.go:334] "Generic (PLEG): container finished" podID="62392df6-29ca-4dfc-b3ab-db13388a43a6" containerID="e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038" exitCode=0
Jan 31 05:04:15 crc kubenswrapper[4812]: I0131 05:04:15.396229 4812 kubelet.go:2453] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" event={"ID":"62392df6-29ca-4dfc-b3ab-db13388a43a6","Type":"ContainerDied","Data":"e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038"}
Jan 31 05:04:15 crc kubenswrapper[4812]: I0131 05:04:15.396568 4812 scope.go:117] "RemoveContainer" containerID="b5e1e557565ae6e023f06823ab0a0b4fc2a87a2bc9f18d7758a155c3082c87ee"
Jan 31 05:04:15 crc kubenswrapper[4812]: I0131 05:04:15.397618 4812 scope.go:117] "RemoveContainer" containerID="e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038"
Jan 31 05:04:15 crc kubenswrapper[4812]: E0131 05:04:15.398000 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6"
Jan 31 05:04:30 crc kubenswrapper[4812]: I0131 05:04:30.339950 4812 scope.go:117] "RemoveContainer" containerID="e0da604a725dfc8e8f0dbbed811aaa83a5dd3e3710befd9b69f03bb4000bb038"
Jan 31 05:04:30 crc kubenswrapper[4812]: E0131 05:04:30.341010 4812 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"machine-config-daemon\" with CrashLoopBackOff: \"back-off 5m0s restarting failed container=machine-config-daemon pod=machine-config-daemon-lx2wb_openshift-machine-config-operator(62392df6-29ca-4dfc-b3ab-db13388a43a6)\"" pod="openshift-machine-config-operator/machine-config-daemon-lx2wb" podUID="62392df6-29ca-4dfc-b3ab-db13388a43a6"